Shigoto

About

A Django project for easy cron job scheduling, task planning, and monitoring.

Features

  • Backend technologies

    • Django
    • Separate settings for different environments (local/production/testing)
    • Python 3.6 or later
    • Accessible from port 8000 for local development
    • DRF REST API
  • Batteries

    • Docker / Docker Compose integration
    • Automated code-formatting using black and prettier
    • py.test and coverage integration
    • Out-of-the-box configuration for gunicorn, traefik
    • Includes PyCharm project config
  • Celery, for background worker tasks

  • WhiteNoise for serving static files in production.

Installation

  • Clone this repository
  • Install Python
  • Install docker and docker-compose, following the instructions for your OS.
  • After installing, navigate to the project directory and install the node packages.

Running the backend locally is pretty simple: all you have to do is clone the repository and run:

$ docker-compose up

The server will be accessible at port 8000. If you want the frontend as well, follow the instructions in the frontend repository.

Running ELK Stack

You can run the ELK stack in Docker Swarm:

$ docker swarm init
$ docker stack deploy -c docker-elk-stack.yml elk
$ docker stack services elk

Go to localhost:5601 and log in.

Running the ELK stack as part of docker-compose

$ export COMPOSE_FILE=docker-compose.yml:docker-compose-kibana-optional.yml
$ docker-compose up

Running Grafana and InfluxDB locally

$ docker-compose -f docker-compose-grafana-chronograf.yml up

Running everything together

$ export COMPOSE_FILE=docker-compose.yml:docker-compose-kibana-optional.yml:docker-compose-grafana-chronograf.yml
$ docker-compose up

Sending e-mails

Locally, MailHog is configured, meaning every e-mail is rerouted to localhost:18000.
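
If you need to see how that routing works, the local e-mail settings look roughly like the sketch below (the service name and ports are assumptions based on a typical MailHog docker-compose setup, not copied from this repository):

# Sketch of local e-mail settings -- assumes a docker-compose service named "mailhog"
# whose SMTP port (1025) Django talks to, with the web UI published on port 18000.
EMAIL_BACKEND = "django.core.mail.backends.smtp.EmailBackend"
EMAIL_HOST = "mailhog"
EMAIL_PORT = 1025
EMAIL_USE_TLS = False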

Generating new e-mail templates

For e-mails we use the MJML framework. Simply install the node packages:

$ npm install

In shigoto/emails/templates/src/..., create a directory for the namespace (or use the existing one) and create a .mjml file in it.

Make sure to import header.mjml and footer.mjml into the e-mail template.

After you've created the e-mail template, run:

$ mjml -m emails/templates/src/path/to/template.mjml -o emails/templates/generated/path/to/template.html

After creating the template, add your template in shigoto.emails.constants.EmailTypes.

Make sure to also add a title and a description for the template.
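
The exact layout of shigoto.emails.constants isn't shown here, so the snippet below is only a hypothetical sketch of what registering a new template type might look like (the constant name and the title/description mapping are assumptions, not the actual module contents):

class EmailTypes:
    USER_SUBSCRIPTION = "user_subscription"
    PASSWORD_RESET = "password_reset"  # hypothetical new template type

# Hypothetical mapping carrying the required title and description per template.
EMAIL_TEMPLATE_META = {
    EmailTypes.PASSWORD_RESET: {
        "title": "Password reset",
        "description": "Sent when a user asks to reset their password.",
    },
}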

Next, in the docker container, run:

$ docker-compose exec django python manage.py generate_email_templates

This will generate an EmailTemplate instance. Last but not least, send the e-mail:

from shigoto_q.emails import services
from shigoto_q.emails.constants import EmailTypes, EmailPriority

services.send_email(
  template_name=EmailTypes.USER_SUBSCRIPTION, 
  priority=EmailPriority.NOW, 
  override_email='simeon.aleksov@shigo.to', 
  context={},
)

Running Tests

To run tests, run the following command:

  docker-compose -f local.yml run django pytest
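
For orientation, a minimal test could look like the sketch below (it assumes the usual pytest-django conventions and the default user model fields; it is not an excerpt from this repository):

import pytest
from django.contrib.auth import get_user_model


@pytest.mark.django_db  # give the test access to the test database
def test_create_user():
    user = get_user_model().objects.create_user(
        username="tester", email="tester@example.com", password="secret"
    )
    assert user.pk is not None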

Internal REST Abstract Views

We provide a ResourceView and ResourceListView as an easier way to extend our app without repeating ourselves.

Both of the classes inherit from BaseView which inherits from rest_framework.views.APIView.

Class variables:

  • serializer_dump_class: serializer class used to serialize the response
  • serializer_load_class: serializer class used to parse and validate request params
  • exception_serializer: default serializer for 400 error responses
  • owner_check: whether to check that the requesting user matches the resource owner
  • permission_classes = [IsAuthenticated]

With these views you can use load and dump serializers.

ResourceListView

An example of a GET view that fetches multiple objects:

class DockerImageListView(ResourceListView):
    serializer_dump_class = serializers.DockerImageSerializer
    serializer_load_class = serializers.DockerImageSerializer
    owner_check = True

    def fetch(self, filters, pagination):
        return fetch_and_paginate(
            func=docker_services.list_docker_images,
            filters=filters,
            pagination=pagination,
            serializer_func=DockerImage.from_dict,
            is_serializer_dataclass=True,
        )

fetch() is a callback function for GET requests.

Validation and parsing of filters and pagination are handled by the abstract class.

ResourceView

class DockerImageCreateView(ResourceView):
    serializer_dump_class = UserImageCreateDumpSerializer
    serializer_load_class = UserImageCreateLoadSerializer
    owner_check = True

    def execute(self, data):
        return task_services.create_docker_image(data=data)

execute() is a callback for POST requests.

class DockerImageCreateView(ResourceView):
    serializer_dump_class = DockerViewSerializer
    serializer_load_class = DockerViewSerializer
    owner_check = True

    def find_one(self, data):
        return task_services.get_docker_image(data=data)

find_one() is a callback for GET requests.
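
Since BaseView ultimately inherits from rest_framework.views.APIView, these classes are wired into the URL configuration with as_view() as usual; the paths and route names below are made up for illustration:

from django.urls import path

from . import views

urlpatterns = [
    # Hypothetical routes -- the real paths live in the project's urls modules.
    path("docker/images/", views.DockerImageListView.as_view(), name="docker-image-list"),
    path("docker/images/create/", views.DockerImageCreateView.as_view(), name="docker-image-create"),
]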

Custom camel case serializer

We have a custom serializer that keeps snake_case in Python and camelCase in our JavaScript. Usage:

class DockerImageDeleteSerializer(CamelCaseSerializer):
    some_field = serializers.IntegerField()

When dumping, it produces someField; when loading, someField is converted back to some_field.
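
Roughly how the round trip behaves, assuming the conversion happens in the serializer itself as described above (a sketch, not an excerpt from the test suite):

# Loading: camelCase keys from the frontend are parsed into snake_case.
serializer = DockerImageDeleteSerializer(data={"someField": 1})
serializer.is_valid(raise_exception=True)
assert serializer.validated_data == {"some_field": 1}

# Dumping: snake_case data comes out as camelCase for the frontend.
assert DockerImageDeleteSerializer({"some_field": 1}).data == {"someField": 1}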

Fetching and paginating objects

If you need to return a lot of objects, use fetch_and_paginate().

fetch_and_paginate:
    Args:
        func: typing.Callable,
        filters: dict,
        pagination: Page,
        serializer_func: typing.Union[dataclass, typing.Callable],
        is_serializer_dataclass=False,