Steve Ives edited this page Oct 23, 2018 · 14 revisions


DevPartner 2018 Workshop


This document details the steps necessary to complete the Harmony Core Workshop that was presented during the post-conference event at the DevPartner 2018 conference.

The workshop consists of the following modules, which should be completed in the order listed.

Required Software


To complete the workshop, and to undertake Harmony Core development generally, you will need the following:

Preparing Your Environment


Verifying your .NET Core SDK Version

The first thing we need to do is verify the version of the .NET Core SDK that you have installed. We require at least version 2.1:

  1. Verify that your computer is connected to the Internet.

  2. Open a Command Prompt window and type the following command:

    dotnet --version

If you don’t get a response, or if you see a version number lower than 2.1, then you don’t have the required SDK and will need to install it. You can download the .NET Core SDK from the Microsoft .NET website.
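The "at least 2.1" check above can be sketched as a tiny helper (illustrative Python only; the function name is invented and this is not part of the workshop environment):

```python
# Sketch: decide whether a reported .NET Core SDK version meets the 2.1 minimum.
# The version string is what `dotnet --version` prints, e.g. "2.1.402".

def sdk_is_sufficient(version: str, minimum=(2, 1)) -> bool:
    """Return True if the dotted version string is at least the minimum."""
    parts = tuple(int(p) for p in version.split(".")[:2])
    return parts >= minimum

print(sdk_is_sufficient("2.1.402"))  # a version that satisfies the requirement
print(sdk_is_sufficient("2.0.3"))    # too old: the SDK must be upgraded
```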

Create and Trust an HTTPS Developer Certificate

We’ll be using HTTPS to access the services that we create, so we’ll be needing a trusted SSL certificate. Fortunately .NET Core now has the tools needed to both create a certificate, and make the certificate trusted on the local machine.

  1. In the command prompt window, execute this command to generate an SSL certificate:
    dotnet dev-certs https

You may see a message saying “A valid HTTPS certificate is already present”. If so, no problem, you already have a certificate.

  2. Next use the following command to register the certificate as a trusted certificate on the local system:
    dotnet dev-certs https --trust

If no certificate was previously trusted then you’ll see a confirmation dialog, and should click the Yes button to trust the certificate.


If your developer certificate is already trusted you will see a message similar to this:

    A valid HTTPS certificate is already present.

In this case no further action is needed.

Recommended Chrome Plugin

If you use the Chrome browser, there is a great extension called JSON Viewer.

This plugin will format and color-code JSON data when it is viewed in the browser, making in-browser testing of a REST API's GET operations a much better experience.

Install Harmony Core Project Templates

The next thing we will do is download and install the Harmony Core project templates. The templates are provided by NuGet packages hosted on NuGet.org.

  1. Type the following command to install the Harmony Core project template:
    dotnet new -i Harmony.Core.ProjectTemplates
  2. Next, type this command to install the DevPartner 2018 Harmony Core workshop-specific project template:
    dotnet new -i Harmony.Core.WorkshopTemplates2018
  3. Now let's verify that the project templates were successfully installed. Type this command:
    dotnet new --list

You should see a list of installed project templates, including two Synergy templates named hcworkshop and harmonycore.

The harmonycore project template is the main Harmony Core template that is intended for mainstream use. It provides a Visual Studio solution containing the projects suggested for getting started with Harmony Core development. Specifically the template provides:

  • A repository project named Repository, containing an empty repository schema. This is a placeholder for you to add or reference your own repository.

  • A .NET Core class library project named Services. This is a place for you to define your data models, web services and service configuration code. The project starts off with some suggested folders named Controllers, Models, and wwwroot, and we include a very minimal Web API controller to make it easy to get a basic web service up and running with minimal effort. This controller is in a file named ValuesController.dbl, and may be deleted as soon as you start to define your own services.

  • A .NET Core console application project named Services.Host, which contains a minimal self-hosting program that is pre-configured to host and expose the services defined in the Services project.

  • A Templates folder containing the Harmony Core CodeGen templates that can be used to automatically generate services based on your repository.

  • A Solution Items folder containing a CodeGen user-defined tokens file and a batch file containing the typical CodeGen commands that can be used to generate code to define, configure, host, test, and document Harmony Core services.

  • The various projects have the necessary project and NuGet package references already in place.

The harmonycore template is intended for everyday use in your actual development environments, but it is only a time saver: a way of quickly creating a starter environment that will be well suited to most requirements. You may also choose to set up the environment completely from scratch in Visual Studio by creating and configuring individual projects.

The hcworkshop template includes everything that the harmonycore template provides, and in addition provides:

  • A repository schema populated with a sample data set.
  • Sample data in the form of sequential files.
  • XDL files to allow the creation of ISAM files for the sample data set.
  • A .NET Core project named Services.Test which is a placeholder for implementing unit tests for services.
  • A pre-configured and working instance of IdentityServer 4 Community Edition, a popular OAuth / OpenID Connect server that can be used to test environments requiring authentication and authorization.

This template was specifically designed for use in this workshop and should not be used for other purposes.

Creating a Development Environment


We now have the basic components that we need to create a new Harmony Core development environment that we will work in for the remainder of the workshop.

  1. Move to a suitable location where you are able to create a new development folder, perhaps your Documents folder, or Desktop folder, or some other place where you have read/write access.

  2. Type the following command to create the development environment:

    dotnet new hcworkshop -n HarmonyCoreWorkshop -o HarmonyCoreWorkshop

You should see lots of messages scroll by, starting with:

    The template "Harmony Core Workshop 2018" was created successfully.
    Processing post-creation actions...

And ending with:

    Restore succeeded.

This command uses the project template named hcworkshop to create a new solution named HarmonyCoreWorkshop and places it into a new subfolder of the same name.

We should now be able to build and even run the environment:

  1. Move into one of the new project folders, like this:

    cd HarmonyCoreWorkshop\Services.Host

  2. Build the project:

    dotnet build

You should see the project build successfully.

  3. And then run the project:

    dotnet run

You should see the application start and display its startup output in the console window.

The environment provided by the project template implements a single very simple OData web service known as the Books service.

  1. Open a web browser and navigate to the following URL:


You should see the web service respond with a collection of two values.

This confirms that the base environment is configured and working correctly. For now let’s shut down the server process:

  1. Return to the command prompt window that is running the application then type Ctrl+C to stop the application.

It is important to understand that .NET Core is a platform-independent environment and does not depend on tools like Visual Studio. We could absolutely continue with this workshop using any text editor, and using the .NET Core CLI tools to build, test, and run our projects.

Open the Solution in Visual Studio

We won’t be very productive if we continue to work at the command prompt, so let’s jump into something a little more advanced, like Visual Studio:

  1. Move back up into the solution folder, then open the development environment in Visual Studio, like this:

    cd ..
    start HarmonyCoreWorkshop.sln

You should see Visual Studio 2017 start and open the solution that we created at the command line.

  2. If Solution Explorer is not visible then open it by typing Ctrl+Alt+L.

  3. In the Solution Explorer toolbar locate and click the Collapse All button.

  4. Right-click Services.Host and select Set as StartUp Project.

It is important to understand that although we will be working with a sample repository and sample data, the fundamental steps would be the same if you were working with your own repository and data files. There are some requirements, assumptions, and restrictions, but for many Synergy data files, exposing the data via a Harmony Core service should be essentially the same process as the one you will follow here.

Explore the Books Sample

  1. Review Models\BooksModels.dbl
  2. Review BooksData.dbl
  3. Review Controllers\BooksController.dbl
  4. Review Startup.dbl

Explore the Sample Repository

Let’s take a few minutes to familiarize ourselves with the repository that we will be using:

  1. From the menu, select Tools > Synergy/DE Repository, and take a few minutes to familiarize yourself with the sample repository. The essentials are:

  • Each structure, of course, has various fields defined, and importantly has all of the associated ISAM file keys correctly defined.

  • Each structure has various relationships to other structures defined. For example, the CUSTOMER structure has a relationship to the ORDERS structure (the orders for that customer), which in turn has a relationship to the ORDER_ITEMS structure (the line items for each order). That structure has a relationship to the ITEMS structure (the item ordered on each order line item), which in turn has a relationship to the VENDORS structure (the vendor that provides the item). Defining these relationships is key to getting the most out of a Harmony Core web service.

  • Five file definitions, each of which associates one of the structure definitions with a physical ISAM or RELATIVE data file.

  2. Close the repository program.

Remove the Books Sample Code

We won’t be using the Books sample code any more, so let’s remove it.

  1. Use Solution Explorer to remove and delete the following files:

    Controllers\BooksController.dbl
    Models\BooksModels.dbl
    BooksData.dbl

Building Basic Web Services



Environments other than Synergy don’t understand Synergy records and fields, so we expose records as data objects whose properties represent the fields in the underlying record. Instead of the alpha, decimal, and integer data in the Synergy records, the equivalent properties use .NET CLS types, making it possible for other environments to consume and interact with the data. These data objects are defined by model classes, each of which is a lightweight representation of a Synergy record.
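To make the idea concrete (Python here purely for illustration; the real model classes are Synergy .NET, and the field layout below is invented), a fixed-width record can be wrapped by an object whose typed properties map onto the underlying fields:

```python
# Illustrative only: expose a fixed-width "record" as an object with typed
# properties, the way a Harmony Core model class wraps a Synergy record.
# Hypothetical layout: 6-character customer number, 20-character name.

class Customer:
    def __init__(self, record: str):
        self._record = record  # the raw record, stored internally

    @property
    def customer_number(self) -> int:
        # a decimal field becomes an integer property
        return int(self._record[0:6])

    @property
    def name(self) -> str:
        # an alpha field becomes a trimmed string property
        return self._record[6:26].rstrip()

c = Customer("000042Flowers By Post     ")
print(c.customer_number, c.name)  # 42 Flowers By Post
```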

In addition to the model class, a metadata class is also used to represent each Synergy record. Where the model class is concerned with exposing the actual data, the metadata class provides additional information about the properties (fields) in the model class.

As you already know, in the sample environment that we’ll be working with there are five data structures called CUSTOMERS, ITEMS, ORDERS, ORDER_ITEMS and VENDORS.

We’ll be using CodeGen to generate model and metadata classes for these five structures, based on generic code that has been defined in two template files.

We’re not going to spend a lot of time looking at the CodeGen templates in detail, because what they produce is more important than what they contain. The templates as shipped are intended to provide you with a good starting point that delivers what many developers need, but they may not work for everyone, and they may not work with every record layout or data file. Ultimately, if you choose to work with Harmony Core, you may need to spend some time learning about CodeGen, and about the supplied templates, so that you can modify them to meet your exact needs.

For now, we’re just going to use some of the templates to generate the data model and metadata classes that we discussed earlier.

Review User-Defined Tokens

Before we can use CodeGen, we need to configure a few options in the supplied environment so that it knows the names of the projects that we just created. This ensures that we generate source files into the correct locations, and into the correct namespaces, for our projects.

First we’ll edit a user-defined token file that was provided in your starter environment:

  1. Look in Solution Items and edit the UserDefinedTokens.tkn file.

This file contains the definitions for several user-defined tokens that are used by many of the Harmony Core CodeGen template files. These user-defined tokens allow you to determine some of the values that will be inserted into the code as it is generated from the templates.

The tokens already contain data, and these values are pre-configured to provide a working set of basic functionality out of the box when used with the other files provided by the project template.

To specify a value for a user-defined token, we simply type the value between the opening and closing tags. Notice that towards the top of the file there are some user-defined tokens like this:


These tokens, for example, are used to determine the namespaces that will be used when generating code. You will notice that their values correspond to some of the project names provided by the project template.

Other tokens in the file are used to inject various other values into generated code. If you want to know more about how each one is used, search for the name of each token within the various template files provided in the Templates folder.

  1. Close the file.

Configure the Code Generation Batch File

We’re almost ready to start generating code, but first we need to configure a batch file that we will be using to issue our CodeGen commands.

  1. Look under Solution Items and edit the regen.bat file.
  2. Near the top of the file look for a comment “Specify the names of the projects to generate code into:”

You will see that three environment variables are set, like this:

    set ServicesProject=Services
    set HostProject=Services.Host
    set TestProject=Services.Test

The ServicesProject, HostProject and TestProject environment variables are used to define the folders and namespaces into which subsequent code will be generated.

  1. Just below look for a comment “Specify the names of the repository structures to generate code from:”

  2. Add the names of our five repository structures, like this:


The STRUCTURES environment variable is used to define which repository structures we process with CodeGen, and hence which source files are produced, based also on the templates that we select.

In the sample environment we have also implemented a mechanism that enables appropriate primary key values to be generated when new records are added via POST operations. This basically involves a “system parameters” relative file that defines data such as “next customer number” and “next order number”. You’ll learn more about how this mechanism works later in the workshop, but for now we simply need to name the repository structure that defines the record layout of this relative file:

  1. Scroll down a little and locate the comment “Specify optional "system parameter file" structure”, then name the structure, like this:


  2. Type Ctrl+S to save changes.
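The "next available key" mechanism described above can be sketched as follows (Python for illustration only; the field and method names are invented, and the real implementation persists the counters in a Synergy relative file):

```python
# Sketch of a "system parameters" record that hands out primary key values
# when new records are created via POST operations.

class SystemParameters:
    def __init__(self, next_customer_number=1, next_order_number=1):
        self.next_customer_number = next_customer_number
        self.next_order_number = next_order_number

    def allocate_customer_number(self) -> int:
        """Return the next customer number and advance the counter."""
        value = self.next_customer_number
        self.next_customer_number += 1  # written back to the file in reality
        return value

params = SystemParameters(next_customer_number=1001)
print(params.allocate_customer_number())  # 1001
print(params.allocate_customer_number())  # 1002
```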

Generate the Initial Code

If you continue to scroll down in regen.bat you will find a comment “Generate a Web API / OData CRUD environment”. Just after that comment you will see two CodeGen commands:

codegen -s %DATA_STRUCTURES% ^
        -t ODataModel ODataMetaData ODataController ^
        -o %SolutionDir%%ServicesProject% -tf ^
        -n %ServicesProject% ^
        %STDOPTS%

codegen -s %DATA_STRUCTURES% -ms ^
        -t ODataDbContext ODataEdmBuilder ODataStartup ^
        -o %SolutionDir%%ServicesProject% ^
        -n %ServicesProject% ^
        %STDOPTS%

These commands cause CodeGen to process the structures listed in the STRUCTURES environment variable, in conjunction with the template files named ODataModel, ODataMetaData, and ODataController for the first command, and ODataDbContext, ODataEdmBuilder, and ODataStartup for the second.

The -o option determines where the output files will be created and the -n option determines the namespace that will be used when generating the code.

The STDOPTS environment variable contains several other command line options that are used as standard in this environment. The value is defined near the top of the file, so you can check it out if you want to know more.

When the first CodeGen command runs it will produce a source file for each of the five structures, in combination with each of the three templates; a total of fifteen new source files should be produced.

The second CodeGen command operates a little differently; notice the -ms (multiple structures) option, which causes all five of the structures being processed to be available to each template all at the same time, so this command will only produce three source files, one per template, and the code in each of those three files will reference all five of the structures being used.
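The difference between the two commands can be summarized with a quick sketch (Python, using the structure and template names from the commands above):

```python
# How many files the two CodeGen commands produce.
structures = ["CUSTOMERS", "ITEMS", "ORDERS", "ORDER_ITEMS", "VENDORS"]

# Without -ms: one output file per structure, per template.
per_structure_templates = ["ODataModel", "ODataMetaData", "ODataController"]
first_command_files = len(structures) * len(per_structure_templates)

# With -ms: all structures are presented to each template at once,
# so there is one output file per template.
multi_structure_templates = ["ODataDbContext", "ODataEdmBuilder", "ODataStartup"]
second_command_files = len(multi_structure_templates)

print(first_command_files, second_command_files)  # 15 3
```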

Although we will be using CodeGen heavily to produce almost all of the code that will be used, this is not a CodeGen workshop and we won’t have time to get into a lot of detail about how CodeGen and template files work. But it is important that you have a basic understanding of what is happening when we run CodeGen, so let’s take a look at one of the template files that we’re about to use to generate code.

  1. Edit Templates\ODataModel.tpl but be very careful not to change anything while editing the file.

At the top of the file you will see some special tokens that look like this:


These are called file header tokens because if used they must always appear at the top of the template file.

The <CODEGEN_FILENAME> token determines the name of the output file that will be created, in this case it also uses a token to inject the name of the structure that is being processed at the time. So when we process the CUSTOMERS structure, we’ll get an output file named Customer.dbl. Notice the final S of the name was dropped; that’s the nature of the “Noplural” tokens.
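As a rough approximation (illustrative Python; the real expansion rules belong to CodeGen and are more sophisticated than this), the effect of the "Noplural" naming on the structure names used in this workshop looks like this:

```python
def output_filename(structure: str) -> str:
    """Approximate the <StructureNoplural>.dbl expansion: drop a trailing S
    from each underscore-separated part, then PascalCase the result."""
    parts = []
    for part in structure.split("_"):
        if part.endswith("S") and len(part) > 1:
            part = part[:-1]          # CUSTOMERS -> CUSTOMER
        parts.append(part.capitalize())
    return "".join(parts) + ".dbl"

print(output_filename("CUSTOMERS"))    # Customer.dbl
print(output_filename("ORDER_ITEMS"))  # OrderItem.dbl
```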

The <REQUIRES_CODEGEN_VERSION> token is self-explanatory, and the <REQUIRES_OPTION> and <CODEGEN_FOLDER> tokens work together to specify that when code is generated it will be written to a Models sub-directory of the specified output directory.

The remainder of the template file contains a combination of code (Synergy .NET code in this case) and CodeGen tokens. Some of the tokens result in various values being injected into the output source file, while others influence how the template and repository structure are processed. For example, if you scroll down you will find a <FIELD_LOOP> token, and further down in the code a matching </FIELD_LOOP> token. This determines that any code between those tokens is inserted for each field in the structure.

This template produces a “data class” that implements an OO representation of a Synergy record. The class will have public properties that represent the fields in the record, and those properties will have .NET CLS data types, so that the data can be used in environments other than Synergy. The class has various other members also.

  1. Close the template file.

We’re going to need to execute the regen.bat batch file many times during this workshop, so we’re going to add a custom tool to the Visual Studio Tools menu to make that easy to do:

  1. From the Visual Studio menu select Tools > External Tools

  2. Click the Add button

  3. Set the Title field to Generate Code

  4. Set the Command field to $(SolutionDir)regen.bat

  5. Set the Initial directory field to $(SolutionDir). You can do this by clicking the drill button to the right and selecting Solution Directory

  6. Ensure the Use Output window option is checked

  7. Click OK to create the new tool. Now we should be able to generate our first code; let’s give it a try.

  8. From the menu select Tools > Generate Code.

  9. Check the Output window and you should see the codegen commands that were executed, and output like this:

    Task complete, 15 files generated.
    - Customer.dbl
    - Item.dbl
    - Order.dbl
    - OrderItem.dbl
    - Vendor.dbl
    - CustomerMetaData.dbl
    - ItemMetaData.dbl
    - OrderMetaData.dbl
    - OrderItemMetaData.dbl
    - VendorMetaData.dbl
    - CustomersController.dbl
    - ItemsController.dbl
    - OrdersController.dbl
    - OrderItemsController.dbl
    - VendorsController.dbl
    Task complete, 3 files generated.
    - DbContext.dbl
    - EdmBuilder.dbl
    - Startup.dbl

The first CodeGen command caused fifteen new source files to be generated (three per structure), and the second caused a further three files to be generated. We’ll look at each type of file shortly, but first we should add the new files to our project so we can view and build them:

  1. In the Services project, right-click the Models folder and select Add > Existing Item, navigate to the Services\Models folder, select all ten source files, and click Add.

  2. Right-click the Controllers folder and select Add > Existing Item, navigate to the Services\Controllers folder, select all five source files and click Add.

  3. Right-click the Services project and select Add > Existing Item, navigate to the Services folder, select all three source files, and click Add.

Let’s examine and understand the code that was just generated:

Examine a Model Class

  1. Edit Models\Customer.dbl.

This is a data class that represents the data of a CUSTOMER record. Instances of this class are customer data objects.

The main purpose of the class is to internally store the value of a CUSTOMER record (in the private field mSynergyData) as well as a copy of the original record (in the private field mOriginalSynergyData) if the instance was originally constructed from an existing record.

There are constructors to create blank and populated instances, and the main bulk of the code consists of properties that represent the fields in the underlying Synergy record, as .NET CLS types.

There are properties that expose the current state of the full record, as well as the original state of the full record, and also a property that exposes a copy of the metadata related to the type. That will be discussed next.

A data class is provided for each of the structures being processed.

Examine a MetaData Class

  1. Edit Models\CustomerMetaData.dbl

As its name suggests, this class exposes metadata related to the CUSTOMER type to the internals of Harmony Core. You will see that code in the class declares information about the fields and keys in the structure, providing details such as the underlying Synergy data type, the length of each field, its position in the overall record, any decimal precision, and so on.

The code also declares to Harmony Core any and all fields that are involved in the structure's keys.

Other utility code is present to, for example, assist in preparing literal key values for the various keys present in the structure.

A metadata class is provided for each of the structures being processed.

Examine the EdmBuilder Class

  1. Edit EdmBuilder.dbl

EdmBuilder stands for “Entity Data Model Builder” and is an Entity Framework class that is responsible for describing the structure of the data being used.

In this case you will see that the code declares that there are five “entities” being used:

;;Declare entities

There is also code that describes information about any segmented keys (primary key information is declared via a different mechanism):


And there is code that declares any alternate keys that are present in each entity type:

data itemType = (@EdmEntityType)tempModel.FindDeclaredType("Services.Models.Item")
tempModel.AddAlternateKeyAnnotation(itemType, new Dictionary<string, IEdmProperty>() {{"VendorNumber",itemType.FindProperty("VendorNumber")}})
tempModel.AddAlternateKeyAnnotation(itemType, new Dictionary<string, IEdmProperty>() {{"FlowerColor",itemType.FindProperty("FlowerColor")}})
tempModel.AddAlternateKeyAnnotation(itemType, new Dictionary<string, IEdmProperty>() {{"Size",itemType.FindProperty("Size")}})
tempModel.AddAlternateKeyAnnotation(itemType, new Dictionary<string, IEdmProperty>() {{"CommonName",itemType.FindProperty("CommonName")}})

Examine the DbContext Class

  1. Edit DbContext.dbl

A DB Context is an Entity Framework type that represents the overall database being used; it exposes members through which the underlying data can be accessed and manipulated.

In this case you will see that its primary purpose is to expose properties that represent all of the various data types that are being exposed in the environment. For example, you will see the following property related to CUSTOMER:

;;; <summary>
;;; Exposes Customer data.
;;; </summary>
public readwrite property Customers, @DbSet<Customer>

Additional code will be added to the DbContext class as the workshop progresses. For example, later code will be added that declares the various relationships between all of the data entities.

The DbContext is declared as a “service” in the ASP.NET Web API environment, and instances of it are provided (via dependency injection) to the controller classes that comprise the RESTful web services environment. This is all configured in the Startup class, which you will see very soon.

An instance of the DbContext class provides the mechanism by which your code interacts with the back-end data, via the Harmony Core Entity Framework provider.

Examine a Controller Class

  1. Edit Controllers\CustomersController.dbl

If you have ever worked in an ASP.NET MVC or ASP.NET Web API environment, then you will already be familiar with the concept of controller classes, which are where the individual operations (the URLs or “endpoints”) of a web application or service are implemented. The same is true here.

Each of the controller classes has a constructor which receives an instance of the DbContext object via dependency injection and stores the reference in a local instance variable for subsequent use by the various methods in the controller.

Each controller currently has two methods (operations) defined:

  • A Get method that returns all entities in the collection (e.g. all customers).
  • A Get method that accepts primary key data and returns a single entity (e.g. a single customer).

Attributes are used to declare the “route” (or URL path) at which each operation is available. For example, the get-all-customers operation is available at the base address of the service, suffixed by /Customers.
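Using the base address that appears later in this workshop, the conventional OData routes can be illustrated like this (simple Python string construction; the URL shapes are what matters here):

```python
# Sketch of the OData route conventions the generated controllers expose.
BASE = "https://localhost:8086/odata"

def collection_url(entity_set: str) -> str:
    # "get all" route: base address suffixed by the entity set name
    return f"{BASE}/{entity_set}"

def by_key_url(entity_set: str, key) -> str:
    # "get one" route: the primary key value in parentheses
    return f"{BASE}/{entity_set}({key})"

print(collection_url("Customers"))  # https://localhost:8086/odata/Customers
print(by_key_url("Customers", 1))   # https://localhost:8086/odata/Customers(1)
```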

The EnableQuery attribute can be used to configure various behaviors and capabilities of the OData environment, as you will see later in the workshop.

Many additional methods (operations) will be added to these basic controller classes as we progress through the workshop.

A controller class is provided for each of the structures being processed.

Examine the Startup Class

  1. Edit Startup.dbl

The final class that was generated was the Startup class, which is responsible for configuring the whole ASP.NET Core MVC, Web API and OData environments, as well as various other optional components.

Currently the code in the file:

  • Configures and loads various Harmony Core services
  • Configures and loads OData services
  • Configures and loads ASP.NET MVC
  • Configures the use of HTTPS and redirects all HTTP traffic to HTTPS endpoints
  • Enables the use of the “developer mode” error pages
  • Enables dependency injection
  • Configures the default route for the service to be “/odata”

Once again additional code will be added to the Startup class as we progress through the workshop and enable additional features.

  1. Close all of the open source files. If you are prompted to save any changes, DO NOT SAVE!

Build the Code

Let’s verify that the code that we just generated builds successfully:

  1. Type Ctrl+Shift+B to build the solution

  2. Check the Output window and make sure you see output similar to this:

    1>------ Build started: Project: IdentityServer, Configuration: Debug Any CPU ------
    2>------ Build started: Project: Services, Configuration: Debug Any CPU ------
    1>IdentityServer -> D:\Conference2018\06_Harmony_Core_Workshop\HarmonyCoreWorkshop\IdentityServer\bin\Debug\netcoreapp2.1\IdentityServer.dll
    3>------ Build started: Project: Services.Host, Configuration: Debug Any CPU ------
    4>------ Build started: Project: Services.Test, Configuration: Debug Any CPU ------
    ========== Build: 4 succeeded, 0 failed, 1 up-to-date, 0 skipped ==========

Run the Code

We now have an Entity Framework / OData environment, and a basic (albeit read only) collection of RESTful web service endpoints. And we also have a basic self-hosting environment that was provided by the project templates, so we should be able to run our code, and the host program should know about and expose our services.

  1. Press F5 to start the host program.
  2. Open a web browser and go to https://localhost:8086/odata

You should see a page listing the available service endpoints.

The hyperlink at the top of the page exposes the metadata for our services:

  1. Click the metadata hyperlink https://localhost:8086/odata/$metadata

This metadata is in a standard, well-known form. There are lots of tools available that will consume the metadata and do things like generate client-side code to simplify the process of interacting with our service, and these tools are available for a wide range of development platforms and languages.

  1. Close the metadata tab.

And the host knows about our service endpoints (our controllers): it offers up a list of the five main entity types that it has discovered and made available (Customers, Items, Orders, OrderItems and Vendors).

However, we haven’t done anything to make our data available in our hosting environment, so the services may be available, but they can’t currently return any data.

  1. In the browser, go to https://localhost:8086/odata/Customers

You should see a response that looks like this:

    {"@odata.context":"https://localhost:8086/odata/$metadata#Customers","value":[]}

The response essentially indicates an empty array of type Customer. Our controller's Get method WAS called, but didn’t return any data.
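The shape of such a response is easy to inspect programmatically (a Python sketch; the payload below mirrors the empty response described in this section):

```python
import json

# The kind of empty response returned before any data files are in place.
response_body = '{"@odata.context":"https://localhost:8086/odata/$metadata#Customers","value":[]}'

payload = json.loads(response_body)
customers = payload["value"]  # the entity collection
print(len(customers))         # 0 - the Get method ran but found no data
```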

  1. Close the browser window, and type Ctrl+C in the host program window to close the program.

Configuring a Self-Hosting Environment


The next thing we need to do is add some code to the Services.Host project: self-hosting code that knows about the Harmony Core environment that our Services project exposes.

Again we will be using CodeGen to provide the base environment that we need.

Generate a new Self-Host Program

  1. In Solution Items edit regen.bat and un-comment the set commands for ENABLE_SELF_HOST_GENERATION and ENABLE_CREATE_TEST_FILES.

These two options cause two things to happen:

  • An additional CodeGen command to be executed to generate a new SelfHost.dbl program:

      codegen -s %FILE_STRUCTURES% -ms ^
              -t ODataStandAloneSelfHost ^
              -o %SolutionDir%%HostProject% ^
              -n %HostProject% ^
              %STDOPTS%
  • The enabling of code within the generated program to create and load the ISAM files each time the host program starts, and delete them again when the program ends.

  2. Type Ctrl+S to save changes then select Tools > Generate Code.

The information in the Output window should now end with:

    Task complete, 1 file generated.

    - SelfHost.dbl

Examine the Self Host Program

  1. Edit Services.Host\SelfHost.dbl

The mainline of the self-hosting program is not too different from the version initially provided by the project template, but you will notice the addition of two new method calls that initialize the environment before host startup and clean up before the program ends.

Notice the code UseStartup(). This is a reference to the Startup class that we generated and inspected earlier, so this part of the statement configures the environment according to all of the code in our Startup class. The .Run() method then starts the server running in a console window. The server will continue running until we manually close it down, at which time the Cleanup method will run.

The Initialize and Cleanup methods are below. They essentially:

  • Ensure that the environment variables needed to open the data files are set.
  • Delete any data files that are already present in the data folder.
  • Create a new set of data files in the data folder.

The idea is to ensure that in our development environment the data is reset to a known state each time the self-hosting program runs. This behavior is controlled by the ENABLE_CREATE_TEST_FILES option that we enabled earlier; if that option is not set, the data files will not be created and deleted on startup and shutdown, but the default environment will still point logical names at the SampleData folder and assume that all files are present in that location.
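To make the reset-to-known-state idea concrete, here is a minimal Python sketch (not the actual generated DBL code); the directory layout, environment variable name and file names are invented for the demo:

```python
import os
import shutil
import tempfile

def reset_test_data(data_dir, seed_dir, filenames):
    """Delete any existing data files, then recreate them from seed copies.

    Mirrors (in spirit) what the generated Initialize method does: point
    the environment at the data folder, remove stale files, and lay down
    a fresh, known data set.
    """
    os.environ["DAT"] = data_dir          # stand-in for the real logical names
    os.makedirs(data_dir, exist_ok=True)
    for name in filenames:
        target = os.path.join(data_dir, name)
        if os.path.exists(target):
            os.remove(target)             # clear out the previous run's data
        shutil.copy(os.path.join(seed_dir, name), target)

# Demo with throwaway directories and one fake data file
seed = tempfile.mkdtemp()
data = tempfile.mkdtemp()
with open(os.path.join(seed, "customers.ism"), "w") as f:
    f.write("seed data")
reset_test_data(data, seed, ["customers.ism"])
print(os.path.exists(os.path.join(data, "customers.ism")))  # True
```

The Cleanup method performs the reverse: it removes the generated files so each run starts from the same baseline.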

Start the Self-Host Program

It’s finally time to start our self-hosting program and interact with our own web services:

  1. Press F5 to start the program.

You should once again see the console window appear with text similar to this:

    Hosting environment: Development
    Content root path: C:\Program Files\dotnet
    Now listening on: http://localhost:8085
    Now listening on: https://localhost:8086
    Application started. Press Ctrl+C to shut down.
  1. Look in the SampleData folder; you should find that the five ISAM files have been created and loaded with data.

These data files will be deleted again when the hosting program closes.

  1. Leave the program running.

Test Basic Functionality in a Browser

To verify that the service is running we can ask it to display information about its available endpoints, which we do by navigating to the root path of the OData service:

  1. Open a Web browser and navigate to https://localhost:8086/odata

You should once again see a response that includes the hyperlink to the service metadata, and a list of the main entities being exposed.

From this information we can determine that we can go to the URL for any of these endpoints and expect to see a collection (EntitySet) of entities of that type. Let’s give that a try:

  1. Modify the URL in the browser to https://localhost:8086/odata/Customers and press enter.

You should see the data for a collection of customers returned.

  1. Modify the URL in the browser to https://localhost:8086/odata/Customers(8)

You should now see the data for the single customer that you selected. You should be able to repeat the same process with the other four endpoints.
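The two URL forms we just used follow a simple OData convention: the entity set name addresses the whole collection, and a key in parentheses addresses a single entity. A minimal sketch (the helper functions are our own, not part of Harmony Core):

```python
# How OData resource URLs are composed for this service.
BASE = "https://localhost:8086/odata"

def collection_url(entity_set):
    # e.g. all customers
    return f"{BASE}/{entity_set}"

def entity_url(entity_set, key):
    # OData addresses a single entity by placing its key in parentheses
    return f"{BASE}/{entity_set}({key})"

print(collection_url("Customers"))   # https://localhost:8086/odata/Customers
print(entity_url("Customers", 8))    # https://localhost:8086/odata/Customers(8)
```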

  1. Switch focus back to the self-host program and type Ctrl + C (you may need to do it twice) to close the program.

You will notice that the program has been logging the activity that resulted from our calls.

Generating API Documentation

Back to top

Every good web API includes documentation that helps developers understand how to interact with the service, and in many cases lets them interact with the service in real time to experience how it works.

In the world of RESTful web services, a set of open source tools called Swagger (from SmartBear) has emerged as a very popular way of modeling and documenting RESTful APIs.

For ASP.NET Web API there is a toolset called Swashbuckle that can parse the Web API controllers and automatically generate the Swagger data necessary to produce a Swagger UI. Unfortunately that toolset doesn't currently work well with OData services, although the developers appear to be working on that.

So for now the approach we will take is to use CodeGen to generate the swagger data in a file, then feed that file into the Swagger UI subsystem to produce documentation.

Swagger is an extensive toolset, and we're not going to go into great detail about what it is or how it works. We're simply going to use one small part of Swagger known as Swagger UI. This tool interprets the endpoint and data structure definitions presented in a "swagger file" and presents that information as a web-based UI that lets developers browse the details of the API and interact with the actual service.

  1. Edit regen.bat and locate and un-comment the ENABLE_SWAGGER_DOCS option.

Enabling this option causes one additional CodeGen command to be executed:

    codegen -s %DATA_STRUCTURES% -ms ^
            -t ODataSwaggerYaml ^
            -o %SolutionDir%%ServicesProject%\wwwroot ^

And also causes additional code to be generated in the Startup class and SelfHost program.

  1. Type Ctrl + S to save changes then select Tools > Generate Code from the menu.

  2. Check the Output window, the most recent messages should be like this:

        Task complete, 1 file generated.
        - SwaggerFile.yaml
  3. Add the new file to the wwwroot folder in the Services project.

  4. Select the SwaggerFile.yaml file and then press F4 to display the file properties window.

  5. Set the value of Copy to Output Directory to Copy always

  6. Close the properties window.

A swagger file describes the capabilities of a RESTful API and the data structures exposed by the API. Swagger files are usually written in a format called YAML (YAML Ain't Markup Language!), although a JSON format can also be used. We use the YAML format because its structure is very well suited to code generation.

  1. Edit SwaggerFile.yaml but DO NOT CHANGE ANYTHING, NOT EVEN SPACES!

You will see that the file contains a hierarchical definition of our entire API and data models. Note that the content of this file will constantly change each and every time we generate code with different options in our regen script.

  1. Close the file.


The hosting environment that we have generated is now configured to serve up the Swagger documentation, and we should be able to view the Swagger API documentation for our RESTful web API:

  1. Press F5 to start the self-hosting environment.

When the self-host application starts you should see an additional information message announcing that API documentation is available and providing the documentation URL.

  1. When the server console application appears, go to your web browser and navigate to https://localhost:8086/api-docs

  2. Take a few minutes to browse around in the API documentation and familiarize yourself with how it presents itself and works.

  3. In particular try the Try it out functionality which lets you interact with the actual running service in real time.


Notice that at the top of the page there is a link to the underlying swagger file (the YAML document). As with the OData metadata, there are tools available that will consume the raw swagger data and allow you to do things like generate client-side libraries to interact with the exposed services.

  1. Close the browser but leave the host application running.

Generating Postman Tests

Back to top

Another tool frequently used by developers to interact with HTTP-based services is an application called Postman. If you don’t already have it installed, you might want to download it from


  1. Edit regen.bat then locate and un-comment the ENABLE_POSTMAN_TESTS option then type Ctrl + S to save changes.

Enabling this option causes one additional CodeGen command to be executed:

    codegen -s %DATA_STRUCTURES% -ms ^
            -t ODataPostManTests ^
            -o %SolutionDir% ^

This command generates a single file, this time containing data in the same format that Postman uses when exporting tests that have been defined in its UI. The generated file can then be imported into the Postman UI for use.

  1. Select Tools > Generate Code from the menu. The content of the Output window should end with:

        Task complete, 1 file generated.
        - PostManTests.postman_collection.json

What we just generated was a JSON file that conforms to the structure of a Postman export file. We can import that file into Postman in order to add tests for our environment.
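For a feel of what such a file contains, here is a minimal, hand-written sketch in the general shape of the Postman collection format (v2.1 schema); the actual generated file is much larger and its exact contents differ:

```python
import json

# Minimal illustrative Postman collection (v2.1 schema shape).
collection = {
    "info": {
        "name": "Harmony Core Sample API",
        "schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json",
    },
    "item": [
        {
            "name": "Get all customers",
            "request": {
                "method": "GET",
                "url": "https://localhost:8086/odata/Customers",
            },
        }
    ],
}

# Round-trip through JSON, as Postman does on import/export
text = json.dumps(collection, indent=2)
parsed = json.loads(text)
print(parsed["info"]["name"])   # Harmony Core Sample API
print(len(parsed["item"]))      # 1
```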

  1. Start Postman

The first time you start Postman you will need to configure it to enable communication with web sites that are using self-signed TLS certificates.

  1. From the Postman menu select File > Settings.

  2. Ensure that the SSL Certificate Verification option is turned OFF.

  3. Close the settings window

Now we can import our test definitions

  1. From the Postman menu select File > Import.

  2. Click the Choose Files button

  3. Browse to your HarmonyCoreWorkshop folder, select the file PostManTests.postman_collection.json and click Open.

You should now see a new test collection named Harmony Core Sample API, and below it several tests you can use. Note that you will need to provide data in several places, for example in URL parameters and sometimes in the body of requests.

STEVE: Demonstrate basic functionality

  1. Close Postman

  2. Switch to the console application that is hosting the services and type Ctrl + C to close the application.

If you want to continue using Postman for testing during the remainder of the workshop then each time we make a change you will need to re-import the Postman tests, like we just did. This is because additional information is generated to the file as we enable more and more optional features.

Adding Alternate Key Endpoints

Back to top

Currently our environment allows us to read all or specific records for each of our entity types, all of which happens by accessing the underlying data files by primary key.

But most ISAM files are likely to have several alternate keys that provide efficient access to the data in other ways, and if it made sense for the original application to access the data that way, then it almost certainly makes sense for the web service to offer endpoints that allow the data to be accessed via those keys.

We’re going to enable another option in our code generation script. This time no additional files will be generated; instead the option enables additional areas of code within the template files that we’re already using, causing additional code to be generated within the existing files.

  1. Edit regen.bat, locate and un-comment the ENABLE_ALTERNATE_KEYS option then type Ctrl + S to save changes.

  2. From the Visual Studio menu select Tools > Generate Code.

The content of the Output window will be the same as previously, but additional content has been generated into the controller classes, documentation files, etc.

  • New endpoint methods were added to all controllers.
  • Additional endpoints were documented in the swagger file.
  • Additional tests were added to the postman file.

Let’s build and test the changes:

  1. Type Ctrl + Shift + B to build the solution and check the Output window and verify that the build was successful.

  2. Press F5 to start the hosting application.

  3. Use the API documentation or Postman to explore the new alternate key endpoints.

  4. Switch focus to the self-host window and type Ctrl + C to stop the application.

Adding Support for Result Set Counts

Back to top

OData services have an optional feature that allows operations that would return a collection of entities to instead return only the count of the entities that would have been returned.

Relational databases have the SELECT COUNT(column) feature, which very efficiently obtains the number of rows that match a query. With ISAM data it is still necessary to run the full query in order to know the result count, but all of that work happens in the back end and does not propagate through the web service, so it can still be a useful feature.

The feature is used by appending /$count to the end of a query URL, but the feature must first be enabled in the service:
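The /$count convention is simple enough to sketch: the count URL is just the collection URL with the suffix appended, and the response body is a plain-text integer rather than a JSON document (the helper names here are our own):

```python
# Composing a $count request and interpreting its plain-text response.
BASE = "https://localhost:8086/odata"

def count_url(entity_set):
    # e.g. https://localhost:8086/odata/Customers/$count
    return f"{BASE}/{entity_set}/$count"

def parse_count(body):
    # The response body is just the number, e.g. b"27"
    return int(body.strip())

print(count_url("Customers"))   # https://localhost:8086/odata/Customers/$count
print(parse_count(b"27"))       # 27
```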

  1. Edit regen.bat, locate and un-comment the ENABLE_COUNT option then type Ctrl + S to save changes.

  2. From the Visual Studio menu, select Tools > Generate.

The content of the Output window will be the same as previously, but additional content has been generated into the controller classes, documentation files, etc.

  • The Startup class now has a new statement builder.Count() in the OData configuration section.
  • Additional endpoints were documented in the swagger file.
  • Additional tests were added to the postman file.
  1. Type Ctrl + Shift + B to build the solution and check the Output window and verify that the build was successful.

  2. Press F5 to start the hosting application.

  3. Use the API documentation or Postman to explore the new collection count endpoints.

  4. Close the self-host program.

Adding Individual Property Endpoints

Back to top

In addition to returning entire entities, a customer for example, OData services have the ability to return individual properties of an entity. For example a client could request only the Name of a customer, or only the price of an inventory item.

These properties allow client developers to be really specific about exactly what data they need at any point in time, and these endpoints are very efficient and fast because they return such a small amount of targeted data.

An example of the data returned by one of these endpoints might look like this:

    {
        "@odata.context": "https://localhost:8086/odata/$metadata#Customers(8)/Name",
        "value": "Broadway Nursery"
    }

As you can see, the data returned is fairly minimal. But there is another option that is even more efficient in terms of response size: by appending “/$value” to the URL, you can request that only the raw value of the data be returned, like this:

    Broadway Nursery
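The difference between the two forms is easy to demonstrate: the property endpoint returns a small JSON document from which the client extracts "value", while the /$value form returns the raw value itself with no parsing needed. A sketch:

```python
import json

# Response from the property endpoint (JSON, as shown above)
property_response = (
    '{"@odata.context": "https://localhost:8086/odata/$metadata#Customers(8)/Name",'
    ' "value": "Broadway Nursery"}'
)
name = json.loads(property_response)["value"]
print(name)   # Broadway Nursery

# With /$value appended, the body IS the raw value; no JSON parsing required.
raw_value_response = "Broadway Nursery"
print(raw_value_response == name)   # True
```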

The Harmony Core CodeGen templates include an option to generate these individual property endpoints, and that’s what we are going to do now:

  1. Edit regen.bat, locate and un-comment the ENABLE_PROPERTY_ENDPOINTS option then type Ctrl + S to save changes.

  2. From the Visual Studio menu select Tools > Generate Code.

Again, no new files were generated, but additional content has been added to several files:

  • A large number of new endpoint methods were added to all controllers.
  • Additional endpoints were documented in the swagger file.
  • Additional tests were added to the postman file.

Let’s build and test the new features:

  1. Type Ctrl + Shift + B to build the solution and check the Output window and verify that the build was successful.

  2. Press F5 to start the hosting application.

  3. Use the API documentation or Postman to explore the new individual property endpoints.

  4. Close the self-host program.

Adding Query Support

Back to top

One of the most powerful features of OData services is the ability to:

  • Specify selection criteria to define a subset of properties to be returned, via $select.
  • Specify expressions to restrict the entities that are returned in a collection, via $filter.
  • Specify ordering criteria for collections, via $orderby.
  • Restrict the number of entities that are returned in a collection, via $top.
  • Specify that a certain number of entities should be ignored before starting to return a collection, via $skip.

By using one or more of these capabilities, client applications can focus in on exactly the data they need, without the developer of the service having to anticipate every client's needs and provide a myriad of alternate endpoints.

Again these are all individual features that can be enabled if required, and that’s what we’re going to do now:

  1. Edit regen.bat and locate and un-comment the following options:
  1. Type Ctrl + S to save changes.
  2. From the Visual Studio menu select Tools > Generate Code.

No new files were generated, this is what changed:

  • Some new OData configuration options were added in the Startup code.
  • Additional capabilities were declared in the swagger file.
  1. Type Ctrl + Shift + B to build the solution then check the Output window and verify that the build was successful.

  2. Press F5 to start the hosting application.

  3. Use a browser, the API documentation or Postman to explore the new query parameter options. Try these operations:

    a. All customers, customer number and name only.

    b. Customers in Washington state.

    c. Customer credit limits, highest to lowest

    d. Three most expensive items

    e. Next three most expensive items

  4. Close the self-host program.

Expanding Relations

Back to top

Another extremely powerful feature of OData services, available if your repository metadata includes information about the relations between structures, is the ability to expose and follow relations between entities when querying.

For example, in our sample data we have customers. Each of those customers may have orders, each of which has one or more order line items, and each line item refers to an inventory item, each of which is supplied by a specific vendor.

If enabled, OData services allow you to follow these relationships when constructing queries, the result being the return of hierarchical information from multiple tables.
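Following relationships is done with the $expand query option, and expands can be nested to walk further down the chain. A minimal sketch (the helper and the navigation property names, Orders and OrderItems, are illustrative assumptions):

```python
# Composing $expand URLs to pull related entities into the response.
BASE = "https://localhost:8086/odata"

def expand_url(entity_set, expand):
    return f"{BASE}/{entity_set}?$expand={expand}"

# A customer together with its orders
print(expand_url("Customers", "Orders"))
# A customer, its orders, and each order's line items (nested expand)
print(expand_url("Customers", "Orders($expand=OrderItems)"))
```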

Let’s enable these capabilities in our services:

  1. Edit regen.bat, locate and un-comment the ENABLE_RELATIONS option then type Ctrl + S to save changes.

  2. From the Visual Studio menu select Tools > Generate Code.

Once again no additional files were generated, but additional code was generated in several areas:

  • Additional properties were added to the model classes to represent relationships to other structures. Some of these properties are defined as collections of other entities (representing one-to-many relationships) while others are individual entities (representing one-to-one relationships).

  • Additional code was added to the metadata classes to declare the presence of the new relation properties, and initialize them when new data objects are created.

  • A new OData option (builder.Expand()) was added to the OData configuration code in the Startup class.

  • Additional parameter options (support for $expand) were added to the swagger documentation.

  1. Type Ctrl + Shift + B to build the solution then check the Output window and verify that the build was successful.

  2. Press F5 to start the hosting application.

  3. Use a browser, the API documentation or Postman to explore the new query options.

  4. Close the self-host program.

Adding Create, Update and Delete Endpoints

Back to top

Everything we have done so far has resulted in endpoints that allow us to query data in a variety of ways by issuing HTTP GET requests. But many web services also allow data to be created, updated and deleted. Next we will extend our web service to support those operations.

Once again we’ll be generating additional operation methods into our web service controller classes, and once again we can control which operations are added by enabling various options within our regen script.

The operations are as follows:

HTTP Method | Capability
----------- | ----------
POST | Create a new entity; primary key values are automatically assigned by the service.
PUT | Replace (full update) an existing entity, creating a new one if it does not exist.
PATCH | Update an existing entity, altering specific properties only.
DELETE | Delete an entity.
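As a sketch of what these write operations look like on the wire, the following builds (but does not send) the requests with Python's standard library; the Name property used in the payloads is an illustrative assumption, not a field from the workshop data:

```python
import json
import urllib.request

BASE = "https://localhost:8086/odata"

def write_request(method, url, payload):
    # JSON body plus an explicit HTTP method, as the table above describes
    data = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        url, data=data, method=method,
        headers={"Content-Type": "application/json"},
    )

# POST creates a new customer; the service assigns the key.
post = write_request("POST", f"{BASE}/Customers", {"Name": "New Customer"})
# PATCH alters only the properties supplied in the body.
patch = write_request("PATCH", f"{BASE}/Customers(8)", {"Name": "Renamed"})

print(post.get_method())    # POST
print(patch.get_method())   # PATCH
```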
  1. Edit regen.bat then locate and un-comment the following options
  1. Type Ctrl + S to save changes.

  2. From the Visual Studio menu select Tools > Generate Code.

The content of the Output window will be the same as previously, but additional content has been generated into the controller classes, documentation files, etc.

  1. Type Ctrl + Shift + B to build the solution.

  2. Check the Output window and verify that the build was successful.

STEVE: Explain what changed

  1. Use PUT to create a new entity.

  2. Use PUT to update an existing entity.

  3. Use POST to create a new entity.

  4. Use PATCH to update an entity.

  5. Use DELETE to delete an entity.

Adding Custom Endpoints via "SPROC"

Back to top

Thus far, all of the endpoints and functionality that we have exposed from our service have been based on simple data-related operations. We have built a fairly powerful REST service, but it is entirely based on pre-defined rules and conventions, and on automatically generated code.

While the functionality of our service is impressive, and may completely satisfy the requirements of some applications, it is likely that it will be necessary to expose hand-crafted custom functionality.

There are several ways of adding custom coded functionality in a Harmony Core environment, one of which is the "Stored Procedure" or "SPROC" mechanism, and that is what we'll use here.

The "SPROC" mechanism enables arbitrary code in the public methods of a class to be exposed as web service operations. To enable this functionality, classes containing such methods are decorated with a {Controller} attribute.

First off we need to make a change to our Startup class to enable the use of SPROC support:

  1. Edit regen.bat then locate and un-comment the ENABLE_SPROC option.

  2. Type Ctrl + S to save changes.

  3. From the Visual Studio menu select Tools > Generate Code.

Enabling this option caused one additional line of code to be inserted into the Startup class:

Custom Code Routing

This code extends the Web API routing rules by adding a mechanism that routes inbound requests directly to methods in classes that are decorated with a {Controller} attribute.

The project template that we used to create the solution that we are currently working in included a sample class containing code that can be exposed via the "SPROC" mechanism, but that source file was set not to be compiled. We need to change that now.

  1. In Solution Explorer right-click on the Services project and select Unload Project.

  2. Right-click on the Services project again and select Edit Services.synproj.

  3. Locate the project file entry for the file CustomFunctionality\OrdersMethods.dbl, which will look like this:

    <None Include="CustomFunctionality\OrdersMethods.dbl" />
  1. Change the line so that the file is included in the build, like this:
    <Compile Include="CustomFunctionality\OrdersMethods.dbl" />
  1. In Solution Explorer right-click on the Services project one more time and select Reload Project.

  2. When prompted to close the open document, click Yes.

  3. When prompted to save changes, click Yes.

Note: We would generally change the build action of a file via the Properties window, in this case changing the value of the Build Action property from None to Compile. Unfortunately, at the time of writing, there are some issues in this area in Visual Studio and we were finding that changes to the project file via the Properties window were not being reliably saved. This is why we just edited the project file manually!

We need to make one more change in order to enable the "SPROC" routing support:

  1. Edit StartupCustom.dbl and remove the comment character from this statement:
    services.TryAddEnumerable(ServiceDescriptor.Singleton<IActionInvokerProvider, HarmonySprocActionInvokerProvider>())
  1. Type Ctrl + Shift + B to build the solution then check the Output window and verify that the build was successful.

  2. Press F5 to start the hosting application.

You should now be able to use Postman to test the new custom operation:

  1. Import another Postman file, named Harmony Core Custom Functionality.postman_collection.json, which you will find in your main HarmonyCoreWorkshop solution directory.

You will find that a new test collection is created, containing two tests.

Create New Order Tests

The Create New Order (Method) test calls the CreateNewOrder method in the OrdersMethods class via the SPROC routing convention. If you examine the request body of this test you will notice that the JSON data being passed to the method includes both a new order and two new order items:

Create New Order

The structure of this data corresponds to the parameters defined in the CreateNewOrder method.

Create New Order Parameters

  1. Execute the test. You should see a response like this:

Create New Order Result

Notice that the HTTP response code is 200 OK, which indicates successful completion of the method, and that the response from the method includes the value 11, which is the order number associated with the newly created order.

  1. The second test named Read Order and Items is pre-configured to retrieve order 11, so go ahead and execute it. You should see the order successfully retrieved, like this:

Create New Order Confirmation

  1. Close the self-host program.

Adding Authentication, Authorization and Field Security

Back to top

This module adds security features to the service:

  • User authentication
  • Role-based authorization on controllers & methods
  • Role-based authorization on field / property visibility
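In the steps below, a JWT access token is obtained and then attached to every request as a bearer token; requests without a valid token are rejected with 401. A minimal sketch of the client side of that scheme (the token string is a placeholder, not a real JWT):

```python
import urllib.request

def authorized_request(url, token):
    # Every call carries the JWT in an Authorization: Bearer header
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {token}"}
    )

req = authorized_request(
    "https://localhost:8086/odata/Customers",
    "eyJhbGciOi...",  # placeholder JWT for illustration only
)
print(req.get_header("Authorization").startswith("Bearer "))   # True
```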
  1. Edit regen.bat and remove the comment from the ENABLE_AUTHENTICATION and ENABLE_FIELD_SECURITY options

  2. Generate code

  3. Change the Build Action property to Compile for PrimaryKeyGenerator.dbl and StartupCustom.dbl

  4. Build

  5. Start IdentityServer (it runs in IIS Express)

  6. Start Services.Host project

  7. Start PostMan

  8. Use Get Access Token (Jodah) and copy JWT

  9. Edit collection properties, variables, and paste the JWT into the CurrentValue field for AccessToken.

  10. Test any operation, all should fail with 401 (unauthorized)

  11. Change request authorization to Bearer Token and notice the value comes from the AccessToken variable.

  12. Operations should now work.

  13. Stop the server.

Now we’ll restrict create, update and delete operations to users in the Manager role (which Jodah is not):

  14. Edit UserDefinedTokens.tkn and uncomment the 5 ROLES_ values:

  15. Save the file

  16. Generate code

  17. Examine a controller

Endpoints now have Authorize attributes too.

  1. Build and start the server

  2. Have Jodah try to delete something – fails

  3. Get JWT for Manny and update the AccessToken variable

  4. Try delete again. Should work now

  5. Stop server  

Adding Unit Tests

Back to top

  • Regen
  • Caused several changes
  • Generate classes to load test data to memory
  • Generate client-side model classes
  • Generate unit test classes
  • Generate unit test self host program
  • Generate unit test environment classes
  • Add files to Services.Test project (4 folders)
  • Copy values from values.default to values
  • Build
  • Examine a data generator class
  • Examine a client side model class
  • Examine a unit test class
  • Examine TestConstants.Values class
  • Examine TestEnvironment class
  • Examine UnitTestEnvironment class