
cookiecutter-azure-poetry


Cookiecutter template for managing a Python project with Poetry, with continuous integration in Azure.

Docker multi-stage build for testing, documentation built with Sphinx, and the package published as an Azure Artifact using twine authentication.

Features

Azure pipelines

Simple continuous integration using Docker tasks to test and publish the package as an Azure Artifact. The Dockerfile contains three stages: the first, called base, installs only production dependencies; the second, called tester, copies in the tests folder and runs pytest; and the last, called publisher, builds the package wheel and publishes it to the private repository. This last stage only makes sense inside the pipeline, because it uses a file that the twine authentication task creates at ${PYPIRC_PATH}, which contains the needed credentials. The build is not done with the poetry build command but rather goes through the setup.py file.
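The three stages described above could look roughly like the following sketch. This is not the template's actual Dockerfile; the base image, paths, and repository name are illustrative assumptions.

```dockerfile
# Illustrative multi-stage build; stage names follow the description above,
# everything else (image tag, paths, repo name) is an assumption.
FROM python:3.8-slim AS base
WORKDIR /app
COPY pyproject.toml poetry.lock ./
RUN pip install poetry && poetry install --no-dev

FROM base AS tester
COPY tests/ tests/
RUN poetry install && poetry run pytest

FROM base AS publisher
ARG PYPIRC_PATH
COPY . .
RUN python setup.py bdist_wheel \
    && pip install twine \
    && twine upload --config-file "${PYPIRC_PATH}" -r azure-feed dist/*
```

In the pipeline, only the tester stage needs to be built for a plain CI run; the publisher stage is built when the ${PYPIRC_PATH} credentials file exists.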

Pydantic settings

Project settings are controlled by pydantic at my_package/settings.py. If an env var needs to be set, you have to edit the class my_package.settings.Settings. For example, suppose you have to add a connection string for some database system as an env var; then:

# The LogFormatter, LogLevel and LogDest enums are defined elsewhere in the package
from pathlib import Path
from typing import Optional

from pydantic import BaseSettings, SecretStr

class Settings(BaseSettings):
    """Project settings variables."""

    PACKAGE_PATH = Path(__file__).parent
    """Package path (python files)."""

    PROJECT_PATH = PACKAGE_PATH.parent
    """Project path (all files)."""
    
    LOG_PATH: Optional[Path]
    """Path to logfile, only works if ``LOG_DESTINATION=FILE``."""
    
    LOG_FORMAT: LogFormatter = LogFormatter.COLOR.value
    """Log style."""
    
    LOG_LEVEL: LogLevel = LogLevel.INFO.value
    """Log level from ``logging`` module."""
    
    LOG_DESTINATION: LogDest = LogDest.CONSOLE.value
    """Destination for logs."""
    
    CONNECTION_STRING: SecretStr # This is a new mandatory env var

    class Config:
        """Inner configuration."""

        env_prefix = "MY_PACKAGE_" # All env vars with this prefix
        use_enum_values = True

You have three options to set the value of CONNECTION_STRING: write it inside the class definition (not recommended for passwords!), set the variable with the export command (set on Windows), or, easiest, write it in the .env configuration file at the root of the project. Environment variables are prefixed with MY_PACKAGE_, i.e. the name of the package in uppercase. Example of a .env file with a simple Postgres connection string:

# .env file

MY_PACKAGE_CONNECTION_STRING=postgresql://foo:bar@localhost:5432/mydatabase
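Under the hood, pydantic reads environment variables that start with the configured env_prefix and matches the remainder against the field names. A rough stdlib-only sketch of that lookup (pydantic's real logic also parses the .env file, converts types, and validates):

```python
import os

# Hypothetical illustration of the MY_PACKAGE_ prefix matching; pydantic's
# real implementation also validates and converts the values.
os.environ["MY_PACKAGE_CONNECTION_STRING"] = "postgresql://foo:bar@localhost:5432/mydatabase"

PREFIX = "MY_PACKAGE_"
candidates = {
    key[len(PREFIX):]: value  # strip the prefix to get the field name
    for key, value in os.environ.items()
    if key.startswith(PREFIX)
}
print(candidates["CONNECTION_STRING"])
```

This is why the field is declared as CONNECTION_STRING in the class but set as MY_PACKAGE_CONNECTION_STRING in the environment.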

The project settings are instantiated in the top-level __init__; to use them, import the SETTINGS object:

from sqlalchemy import create_engine

from my_package import SETTINGS # Here SETTINGS is an instance of ``my_package.settings.Settings``

engine = create_engine(SETTINGS.CONNECTION_STRING.get_secret_value()) # We declared the connection as SecretStr!
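Because CONNECTION_STRING is declared as SecretStr, printing or logging the settings will not leak the value; only get_secret_value() returns it. A minimal sketch of that idea (not pydantic's actual class):

```python
# Minimal sketch of the SecretStr idea: hide the value from repr/str and
# expose it only through get_secret_value(). Not pydantic's real implementation.
class SecretStr:
    def __init__(self, value: str) -> None:
        self._value = value

    def __repr__(self) -> str:
        # Mask the secret whenever the object is printed or logged
        return "SecretStr('**********')"

    def get_secret_value(self) -> str:
        return self._value


secret = SecretStr("postgresql://foo:bar@localhost:5432/mydatabase")
print(secret)                     # SecretStr('**********')
print(secret.get_secret_value())  # postgresql://foo:bar@localhost:5432/mydatabase
```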

Tox

Tox is used to create the virtual environment, clear all temporary files (such as __pycache__), format the code, and lint it. Here is the list of available commands:

# Creates a virtual environment using the Python version declared when the cookiecutter was generated; it must exist on the system, otherwise an error is raised
tox -e venv

# Delete all temporary files
tox -e clear

# Format code using black, isort, docformatter and autopep8
tox -e format

# Lint code using pylint, mypy, bandit and flake8
tox -e lint

⚠️ The virtual environment must be created with tox -e venv before using the other commands, otherwise unexpected behavior may occur!

Logging

A nice logger is configured using structlog. It is controlled in the settings class through three attributes: LOG_FORMAT, with possible values JSON or COLOR (the latter requires colorama); LOG_DESTINATION, with possible values CONSOLE or FILE (if FILE is set, LOG_PATH must also be set to the file where logs are saved); and LOG_LEVEL, which is the logger level and uses the same values as the logging module.

The logger can also be configured from the .env file:

# .env file

MY_PACKAGE_LOG_LEVEL=20
MY_PACKAGE_LOG_DESTINATION=CONSOLE
MY_PACKAGE_LOG_FORMAT=JSON

The logger accepts arbitrary keyword arguments to add extra information, for example printing to the console:

from my_package import logger

logger.info("This is an example log", example_kwarg="this is an example kwarg", other_example_kwarg="this is another example kwarg")

>>> 2022-03-29T18:40:54.646488Z [info     ] This is an example log         [my_package] example_kwarg=this is an example kwarg other_example_kwarg=this is another example kwarg

Other levels are logger.warning, logger.debug and logger.error.

Custom Exceptions

There is a file exc.py with a custom exception mixin; classes that inherit from it build their error message from a template, rendering the kwargs passed at instantiation inside the message:

# my_package/exc.py

class ExampleError(ErrorMixin, NameError):
    """Raise this when a table name has not been found."""

    msg_template = "This is an example error class; every `{variable}` inside brackets is filled from the kwargs passed when the class is instantiated and rendered in this message."
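The ErrorMixin itself is not shown in the snippet above; a plausible sketch of the pattern, assuming the mixin simply formats msg_template with the instantiation kwargs (the template's actual implementation may differ):

```python
# Hypothetical sketch of the ErrorMixin pattern; class and message names
# below are illustrative, not the template's real contents.
class ErrorMixin(Exception):
    """Build the exception message by rendering ``msg_template`` with kwargs."""

    msg_template: str = ""

    def __init__(self, **kwargs) -> None:
        # Each {placeholder} in the template is filled from the kwargs
        super().__init__(self.msg_template.format(**kwargs))


class TableNotFoundError(ErrorMixin, NameError):
    """Raise this when a table name has not been found."""

    msg_template = "Table `{table}` was not found in schema `{schema}`."


print(TableNotFoundError(table="users", schema="public"))
# Table `users` was not found in schema `public`.
```

Because the mixin is combined with a standard exception class (here NameError), callers can still catch the error by its built-in type.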

TODO

  • Test the cookiecutter project.
  • Add public option for CI/CD like Travis or GitHub Actions.
  • Add sphinx into CI/CD to publish documentation.
  • Write a better readme.
  • Write a better TODO.
