
TypeScript training course - Backend

🇪🇸 Version

📖 Index

πŸ” Description

This repository provides the backend support for the TypeScript training course it belongs to.

In this case, we are implementing the backend side of a social media application that allows users to register and authenticate, create and delete posts, create and delete comments on posts, and like/dislike posts.

The original idea of this application is taken from the really interesting workshop named Build a Social Media App (MERNG stack), created by Classsed and published by freecodecamp.org.

Unlike the original version, this code implements a REST API to access the backend features.

Most of this code follows the functional programming paradigm, while OOP is used only to create the error objects that are thrown when an exception appears.

Some of the tools used in this repository are:

  • 📦 Webpack for transpiling and bundling the TypeScript code.
  • 🔒 JWT as the token service.
  • ⚙️ dotenv for environment variables.
  • 📝 Swagger for REST API documentation.
  • 💾 Log4JS for logging tasks.
  • ✅ Joi for validating input data.
  • 🧪 Jest for unit testing, as well as supertest for API endpoint integration tests.
  • 🔍 ESLint for code linting and formatting.
  • 🐶 Husky for managing the Git hooks.
  • 🐳 Docker for container image management.
  • 🌱 MongoDB as the database engine.
  • 📜 manifest.json file to retrieve information about the running service.

This repository is set up to work with NodeJS 14.15.0 LTS.

If you are running a different version of NodeJS on your system, just run nvm use and it will switch to the version defined in the .nvmrc file.

💻 System requirements

To run this code, your system must satisfy the following minimum requirements:

  • NodeJS 14.15.0
  • npm 6.14.11
  • npx 6.14.11
  • Docker 20.10.2
  • docker-compose 1.27.4

In addition, it's advisable to have the following:

  • nvm 0.33.0
  • Web browser (recommended Google Chrome 88.0)
  • Database UI tool (recommended Robo 3T 1.4.1)
  • Code editor (recommended VScode 1.52.1)

👀 Repository overview

⚙️ Environment variables

Because we have selected dotenv as the environment variables handler, there is a folder named env in the root of the project.

In this folder you have to define at least three different environment files:

  • .env for production.
  • .env.dev for development.
  • .env.test for testing.

Feel free to remove some of them or include additional ones, depending on your application's needs. Just keep in mind that you will have to update the Webpack configuration for the environment you are going to work in.

The scripts created for running the application in each environment are prepared to load the corresponding configuration and apply it to the code.

For production purposes, Webpack is already configured to record that information into the final bundled file, so we don't need to think about providing environment configurations to the application image.

The most basic fields we must include in these files are:

NODE_ENV="production" | "development" | "test"

# Set the port number that best matches for your environment.
SERVER_PORT=4000

# Set the logging level that best matches for your environment.
LOGGER_LEVEL="off" | "fatal" | "error" | "warn" | "info" | "debug" | "trace" | "all"

# Set the database configuration that best matches for your environment.
MONGO_USER='tstest'
MONGO_PASS='tstest'
MONGO_HOST='localhost'
MONGO_PORT='32023'
MONGO_DB='ts-course-test'

# Set the encryption configuration that best matches for your environment.
BCRYPT_SALT=3

# Set the token configuration that best matches for your environment.
JWT_KEY='testingkey'
JWT_ALGORITHM='HS512'
JWT_EXPIRING_TIME_IN_SECONDS=60

# ⚠️ Only to be included in the '.env.test' file
PLAIN_PASSWORD='123456'
WRONG_PASSWORD='wrongpassword'
# ⚠️ Only to be included in the '.env.test' file

# Rest of the environment variables here.
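
As a reference, here is a minimal sketch of how these variables could be loaded and typed from the env folder. The file path and the exported config object are illustrative assumptions, not the exact implementation of this repository:

// Hypothetical src/common/config.ts
import dotenv from 'dotenv'
import path from 'path'

// Select the env file that matches the current NODE_ENV.
const envFiles: Record<string, string> = {
  production: '.env',
  development: '.env.dev',
  test: '.env.test'
}
const envFile = envFiles[process.env.NODE_ENV ?? 'development'] ?? '.env.dev'

dotenv.config({ path: path.join(process.cwd(), 'env', envFile) })

export const config = {
  serverPort: Number(process.env.SERVER_PORT ?? 4000),
  loggerLevel: process.env.LOGGER_LEVEL ?? 'info',
  mongo: {
    user: process.env.MONGO_USER,
    pass: process.env.MONGO_PASS,
    host: process.env.MONGO_HOST,
    port: process.env.MONGO_PORT,
    db: process.env.MONGO_DB
  },
  bcryptSalt: Number(process.env.BCRYPT_SALT ?? 10),
  jwt: {
    key: process.env.JWT_KEY,
    algorithm: process.env.JWT_ALGORITHM,
    expiringTimeInSeconds: Number(process.env.JWT_EXPIRING_TIME_IN_SECONDS ?? 60)
  }
}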

πŸ— Architecture

This repository is implemented following the most basic Layered Architecture, that is, domain and infrastructure.

The full folder structure is:

📂 src/
    📂 common/
    |   📂 errors/
    |   📂 logger/
    |   📂 utils/
    📂 domain/
    |   📂 models/
    |   📂 services/
    📂 infrastructure/
    |   📂 authentication/
    |   |   📂 token/
    |   📂 dataSources/
    |   📂 dtos/
    |   📂 mappers/
    |   📂 orm/
    |   |   📂 mongoose/
    |   📂 server/
    |   |   📂 apidoc/
    |   |   📂 middlewares/
    |   |   📂 routes/
    |   |   📂 serverDtos/
    |   📂 types/
    📂 test
        📂 fixtures

🔄 common

In this layer we implement the set of elements that are horizontally common to the whole application.

The folders used in this section and their purposes are:

  • errors

    This folder contains the error handling configuration for the whole application.

    This is where the only OOP part of the code is implemented.

    The errors are grouped by functionality into specific folders: authentication, posts, users and common errors (a small sketch follows this list).

  • logger

    This is where the logging tool used in the application is configured.

  • utils

    There is not much more to talk about it πŸ˜…
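
To give an idea of that OOP part, here is a hedged sketch of what one of these error classes could look like; the class names and fields are illustrative assumptions, not the exact ones defined in the errors folder:

// Hypothetical base error class for the whole application.
export class ApiError extends Error {
  constructor (
    public readonly status: number,
    message: string,
    public readonly description?: string
  ) {
    super(message)
    this.name = this.constructor.name
  }
}

// Hypothetical functionality-specific error built on top of it.
export class UserNotFoundError extends ApiError {
  constructor (description?: string) {
    super(404, 'User not found', description)
  }
}

In a setup like this, such errors can be thrown from any layer and mapped to HTTP responses by an error-handling middleware.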

🎯 domain

This layer is also known as entities or core in different architecture approaches.

This layer has two main goals:

  1. To define the application's own data structures.

    This is done in the models folder, where we can find several definitions of how our application manages its information.

  2. To implement specific business logic strongly bound to the application's use.

    In this basic example of layered architecture, the business logic is defined in multiple services, grouped by functionality, inside the services folder.

    A quick rule to know whether a piece of code belongs to the domain layer is to ask ourselves "is my application the same if I extract this code from it?" If the answer is NO, then this code must be placed in the domain layer.
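
To make that rule more tangible, here is a minimal sketch of what a domain service could look like. The model and data source below are simplified, hypothetical stand-ins for the real ones in this repository:

// Hypothetical domain model and data source contract.
interface PostDomainModel {
  id: string
  body: string
  owner: string
}

interface PostDataSource {
  getPostById: (postId: string) => Promise<PostDomainModel | null>
}

// The service holds the business rule ("a missing post is an error"),
// while the actual persistence details live in the infrastructure layer.
export const getPostById = async (
  dataSource: PostDataSource,
  postId: string
): Promise<PostDomainModel> => {
  const post = await dataSource.getPostById(postId)
  if (!post) {
    throw new Error(`Post '${postId}' not found.`)
  }
  return post
}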

🧩 infrastructure

In this layer we implement the tools that are strongly coupled to specific technologies.

The strategy for this layer is to keep in mind that if, during development or in future refactors, some element in this layer has to be replaced by another one that provides the same or better results, our application must not be affected, and even if it is, the side effects must be really shallow.

To reach that goal, the code included in this layer is divided as follows:

  • authentication

    This folder contains the user token management system, which in this case is based on JWT.

  • dataSources

    This section contains all the elements focused on providing data persistence and retrieval for the application.

    The goal of the code in this folder is to isolate the domain code from the different data access tools that we could implement in our application.

    Given that context, this code will be invoked only by domain services.

    In the same way, this code will only invoke functions defined in the different data access tools.

    Because this application uses only a single ORM, all calls will be made against it.

  • dtos

    The multiple data providers that we can use in our application will provide different data structures.

    We need to define those structures in order to handle that information.

    That information management is bidirectional: we use those structures to receive information from the data sources as well as to send data to them.

    For this reason we implement the DTO (Data Transfer Object) pattern.

  • mappers

    When data needs to move from the data sources to the application and vice versa, its structure must be mapped from DTO to Data Model (when our application consumes data) and from Data Model to DTO (when our application generates data).

    These operations are performed by specific functions that implement the mapper pattern (a small sketch follows this list).

  • orm

    This is obviously the direct access to our data persisted in databases.

    In this case, we are using MongoDB as the database engine and Mongoose as the ORM, so their whole configuration and business logic definitions are done in this folder.

  • server

    This folder contains the complete ExpressJS configuration, including middlewares definitions and API documentation.

  • types

    This folder is specifically bound to the use of TypeScript on this project.

    In this case, the types folder, which contains different type and interface definitions, is placed in the infrastructure layer because it contains only data structures used in this layer.

    If there were other type and interface definitions used across different layers, it would be possible to create a new types folder at an upper level, for example in the common folder.
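
To illustrate the DTO and mapper ideas described above, here is a small sketch based on a hypothetical user structure; the field names are assumptions for the example only:

// Hypothetical DTO as handled by the persistence layer.
interface UserDto {
  _id: string
  email: string
  created_at: string
}

// Hypothetical domain model as used by the application.
interface UserDomainModel {
  id: string
  email: string
  createdAt: Date
}

// Mapper: DTO -> Domain Model (when the application consumes data).
export const mapUserFromDtoToDomainModel = (dto: UserDto): UserDomainModel => ({
  id: dto._id,
  email: dto.email,
  createdAt: new Date(dto.created_at)
})

// Mapper: Domain Model -> DTO (when the application generates data).
export const mapUserFromDomainModelToDto = (user: UserDomainModel): UserDto => ({
  _id: user.id,
  email: user.email,
  created_at: user.createdAt.toISOString()
})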

🧪 test

The testing strategy selected in this repository, for both unit and integration tests, is to keep them as close as possible to the code they check.

For this reason, you will find several test folders in the different sections of this code.

Webpack is already configured to ignore these files when the code is compiled for the production environment.

That said, the content of this folder is a set of common tools used throughout the code, its main part being the fixtures used to emulate real running conditions.
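
As a reference, here is a hedged sketch of what an integration test based on Jest and supertest could look like; the server import and the checked endpoint are assumptions for the example, not necessarily the exact tests of this repository:

// Hypothetical integration test using Jest and supertest.
import supertest from 'supertest'
import { server } from '../../infrastructure/server' // assumed export

describe('[GET] /__/manifest', () => {
  it('returns the service manifest', async () => {
    const { status, body } = await supertest(server).get('/__/manifest')

    expect(status).toBe(200)
    expect(body).toHaveProperty('name')
    expect(body).toHaveProperty('version')
  })
})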

🛠 Execution environments

While we create a new application, we usually need at least two environments: testing and development.

Both environments require specific configurations as well as database presets.

The first requirement is covered by the specific .env files that we configure for every case.

The second one is satisfied in this case by different Docker containers that run in parallel with the code. This means that the scripts (defined in the package.json file) start the testing or development database container, depending on the environment we are running.

Both environments are configured to run independently, so we can have both up at the same time.

🔥 Commands guide

✅ Switch Node version

nvm use

⬇️ Modules installation process

npm i

🧪 Run tests

Required files:

  • env/.env.test
# Unit and integration tests.
npm test
# Watch mode.
npm run test:watch
# Coverage.
npm run test:coverage

🏭 Run application in development mode

Required files:

  • env/.env.dev
npm run build:dev

📜 Generate the manifest file

This feature is focused on providing information about the application, service or microservice when we deploy it via a Docker container or in any other running situation.

Once we have installed all the modules (npm i), we can run the following command, which will create the manifest.json file in the root of our project.

npm run manifest

The structure of the generated file is:

{
    "name": "project name that matches with this key in the package.json file",
    "version": "project version that matches with this key in the package.json file",
    "timestamp": "file creation timestamp in ISO format, it means AAAA-MM-DDTHH:MM:SS.sssZ",
    "scm": {
        "remote": "repository remote path that matches with the remote.origin.url key in the project's GIT configuration",
        "branch": "GIT branch seleted when the file was created",
        "commit": "head GIT commit ID when the file was created"
    }
}

This way, when we request the /__/manifest endpoint of our service, we will receive this information.

This is relevant for two reasons: if we receive that data, we know our service is up, and in addition we get information about what the service contains.
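
A minimal sketch of how such an endpoint could be wired with ExpressJS is shown below; the file locations are assumptions, and the JSON import assumes the resolveJsonModule compiler option is enabled:

// Hypothetical route exposing the generated manifest file.
import express from 'express'
import manifest from '../../manifest.json' // created by 'npm run manifest'

const manifestRouter = express.Router()

manifestRouter.get('/__/manifest', (_req, res) => {
  res.json(manifest)
})

export { manifestRouter }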

For production purposes, don't worry about creating the manifest file before bundling the code, because this command is included in the build:pro script. So you can be sure that when you create the bundle, the most up-to-date information about the service will be included in it.

🚀 Build application

Required files:

  • env/.env
npm run build:pro

Once this process is complete, the bundled code is available in the dist folder to be included in the Docker image.

📗 REST API documentation

http://localhost:3600/__/apidoc

The access port must be defined in the environment variables. Take a look at the environment variables section.
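
As a reference, here is a hedged sketch of how this documentation endpoint could be mounted with swagger-ui-express; the package choice and file locations are assumptions, not necessarily the exact setup used in this repository:

// Hypothetical Swagger UI wiring for the documentation endpoint.
import express from 'express'
import swaggerUi from 'swagger-ui-express'
import apidocSpec from './apidoc/openapi.json' // assumed spec location

const app = express()

// Serve the interactive API documentation at /__/apidoc.
app.use('/__/apidoc', swaggerUi.serve, swaggerUi.setup(apidocSpec))

app.listen(Number(process.env.SERVER_PORT ?? 3600))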

πŸ™ Credits and thanks

Thank you so much to the content creator:

Thanks a lot for such incredible support to:

πŸ“ TODO list

  • Include production configuration to compile and generate a Docker container ready to deploy.
  • Include Postman requests to test the API.
  • Include Insomnia requests to test the API.

🔬 Research list

🤔 Technical debt

Please check the TECH-DEBT file in this repository to keep up to date on this subject.
