A full production backend API built with the following tech stack:

- REST API: Flask and Flask-RESTful.
- Database: PostgreSQL.
- Unit Testing: Pytest.
- Package Management: Poetry.
- Containerization: Docker and Docker Compose.
- Cloud Provider: Microsoft Azure:
  - Azure VNet (Virtual Network).
  - Azure Virtual Machine.
  - Azure Database for PostgreSQL.
  - Azure Blob Storage.
  - Azure Container Registry (ACR).
- Infrastructure as Code: Terraform.
- CI/CD: GitHub Actions.
- Version Control: Git and GitHub.
## Set the Environment Variables

- Copy the `backend/.env.sample/` folder and rename it to `backend/.env/`.
## Run the Base Environment Locally

- Update the `backend/.env/.env.base` file.
- Run Docker Compose:

  ```shell
  docker compose -f backend/.docker-compose/base.yml up -d --build
  ```

- Run Pytest:

  ```shell
  docker exec -it kodeec_base_flask /bin/bash -c "/opt/venv/bin/pytest"
  ```
## Run the Production Environment Locally

- Get the environment variables from the infrastructure:

  ```shell
  python scripts/get_infra_output.py --c=infrastructure/.docker-compose.yml --m=azure --f=env
  ```

- Update the `backend/.env/.env.production` file.
- Run Docker Compose:

  ```shell
  docker compose -f backend/.docker-compose/production.yml up -d --build
  ```
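The `get_infra_output.py` helper above is not shown in this document; a plausible sketch of its core logic is below. The function names, the exact Terraform output shape, and the formatting rules are assumptions, not the actual script.

```python
# Hypothetical sketch of scripts/get_infra_output.py: read a module's
# Terraform outputs through the Compose-wrapped Terraform container and
# print them either as .env lines ("env") or as secret names ("github").
import json
import subprocess


def read_terraform_outputs(compose_file: str, output_name: str) -> dict:
    """Fetch a Terraform output as JSON via Docker Compose (assumed shape)."""
    result = subprocess.run(
        ["docker", "compose", "-f", compose_file,
         "run", "--rm", "terraform", "output", "-json", output_name],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)


def format_outputs(outputs: dict, fmt: str) -> str:
    """Render outputs as .env lines ("env") or as names only ("github")."""
    if fmt == "env":
        return "\n".join(f"{key.upper()}={value}" for key, value in outputs.items())
    if fmt == "github":
        return "\n".join(key.upper() for key in outputs)
    raise ValueError(f"unknown format: {fmt}")
```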
## Set Up the Terraform Backend

- Create a storage container on Azure Blob Storage:

  ```shell
  export resource_group=terraform-resource-group
  export location=eastus
  export storage_account=myterraformbackend
  export container=terraform-backend

  az group create --name $resource_group --location $location

  az storage account create \
    --name $storage_account \
    --resource-group $resource_group \
    --location $location \
    --sku Standard_RAGRS \
    --kind StorageV2

  az storage container create \
    --account-name $storage_account \
    --name $container \
    --auth-mode login
  ```

- Authenticate to the Azure provider by following this link.
- Create a file named `.backend.hcl` under the `infrastructure` folder.
- Copy the contents of the `.backend.hcl.sample` file into it and fill in the values.
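As a sketch of what `.backend.hcl` carries, the `azurerm` backend expects the settings below; the values shown mirror the variables from the storage-creation script above and should be replaced with your own.

```hcl
# Illustrative .backend.hcl — values must match the storage you created.
resource_group_name  = "terraform-resource-group"
storage_account_name = "myterraformbackend"
container_name       = "terraform-backend"
key                  = "terraform.tfstate"
```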
## Set Up Secrets

- Create a file named `.secrets.auto.tfvars` under the `infrastructure` folder.
- Copy the contents of the `.secrets.auto.tfvars.sample` file into it and fill in the values.
## Set Up SSH

- Generate an SSH key pair.
- Create a folder named `.ssh` under the `infrastructure` folder.
- Copy the `id_rsa.pub` and `id_rsa` files to `infrastructure/.ssh`.
## Run Terraform Commands

- `terraform init`:

  ```shell
  docker compose -f infrastructure/.docker-compose.yml run --rm terraform init -backend-config=.backend.hcl
  ```

- `terraform plan` (all modules):

  ```shell
  docker compose -f infrastructure/.docker-compose.yml run --rm terraform plan
  ```

- `terraform plan` (`azure` module only):

  ```shell
  docker compose -f infrastructure/.docker-compose.yml run --rm terraform plan -target="module.azure"
  ```

- `terraform plan` (`github` module only):

  ```shell
  docker compose -f infrastructure/.docker-compose.yml run --rm terraform plan -target="module.github"
  ```

- `terraform apply` (all modules):

  ```shell
  docker compose -f infrastructure/.docker-compose.yml run --rm terraform apply --auto-approve
  ```

- `terraform apply` (`azure` module only):

  ```shell
  docker compose -f infrastructure/.docker-compose.yml run --rm terraform apply -target="module.azure" --auto-approve
  ```

- `terraform apply` (`github` module only):

  ```shell
  docker compose -f infrastructure/.docker-compose.yml run --rm terraform apply -target="module.github" --auto-approve
  ```

- `terraform destroy` (all modules):

  ```shell
  docker compose -f infrastructure/.docker-compose.yml run --rm terraform destroy --auto-approve
  ```

- `terraform destroy` (`azure` module only):

  ```shell
  docker compose -f infrastructure/.docker-compose.yml run --rm terraform destroy -target="module.azure" --auto-approve
  ```

- `terraform destroy` (`github` module only):

  ```shell
  docker compose -f infrastructure/.docker-compose.yml run --rm terraform destroy -target="module.github" --auto-approve
  ```

- `terraform output` (`azure` module):

  ```shell
  docker compose -f infrastructure/.docker-compose.yml run --rm terraform output azure
  ```
## Build and Deploy the Backend

- Create the Azure resources by following the infrastructure section.
- Export these values and change them according to your infrastructure:

  ```shell
  export ACR_URL=<YOUR_ACR_URL>
  export ACR_USERNAME=<YOUR_ACR_USERNAME>
  export ACR_PASSWORD=<YOUR_ACR_PASSWORD>
  export IMG_NAME=<your_image_name>
  export IMG_TAG=<your_image_tag>
  export FINAL_IMAGE=$ACR_USERNAME.azurecr.io/$IMG_NAME:$IMG_TAG
  export ENVIRONMENT=production
  export MACHINE_IP=<YOUR_MACHINE_IP>
  export MACHINE_USER=<YOUR_MACHINE_USER>
  ```

- Log in to Azure Container Registry:

  ```shell
  docker login $ACR_URL -u $ACR_USERNAME -p $ACR_PASSWORD
  ```

- Build the Docker image:

  ```shell
  docker build -t $FINAL_IMAGE -f backend/Dockerfile backend --build-arg ENVIRONMENT=$ENVIRONMENT
  ```

- Push the Docker image to Azure Container Registry:

  ```shell
  docker push $FINAL_IMAGE
  ```

- Copy the env file and the run script to the server:

  ```shell
  rsync backend/.env/.env.$ENVIRONMENT scripts/run_backend.py $MACHINE_USER@$MACHINE_IP:/home/$MACHINE_USER
  ```

- Log in to Azure Container Registry on the server:

  ```shell
  ssh $MACHINE_USER@$MACHINE_IP "docker login $ACR_URL -u $ACR_USERNAME -p $ACR_PASSWORD"
  ```

- Run the script on the server:

  ```shell
  ssh $MACHINE_USER@$MACHINE_IP "python3 run_backend.py --env=.env.$ENVIRONMENT --image=$FINAL_IMAGE"
  ```
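The `run_backend.py` script invoked on the server is not shown in this document; a plausible sketch of its logic is below. The container name, port mapping, and restart policy are assumptions for illustration, not the actual script.

```python
# Hypothetical sketch of scripts/run_backend.py: replace the old backend
# container, pull the freshly pushed image, and start it with the uploaded
# env file. All names and flags here are assumptions.
import argparse
import subprocess

CONTAINER_NAME = "kodeec_backend"  # assumed container name


def build_run_command(env_file: str, image: str) -> list[str]:
    """Build the `docker run` invocation for the backend container."""
    return [
        "docker", "run", "-d",
        "--name", CONTAINER_NAME,
        "--env-file", env_file,
        "-p", "80:5000",            # assumed host:container port mapping
        "--restart", "unless-stopped",
        image,
    ]


def main() -> None:
    parser = argparse.ArgumentParser()
    parser.add_argument("--env", required=True)
    parser.add_argument("--image", required=True)
    args = parser.parse_args()

    # Remove any previous container, pull the new image, then start it.
    subprocess.run(["docker", "rm", "-f", CONTAINER_NAME], check=False)
    subprocess.run(["docker", "pull", args.image], check=True)
    subprocess.run(build_run_command(args.env, args.image), check=True)
```

Calling `main()` from a `__main__` guard reproduces the `python3 run_backend.py --env=… --image=…` invocation used in the deploy step above.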
## Set Up CI/CD

- Get the environment variables from the infrastructure:

  ```shell
  python scripts/get_infra_output.py --c=infrastructure/.docker-compose.yml --m=azure --f=github
  ```

- Update the `infrastructure/.secrets.auto.tfvars` file with the new values.
- Apply the `github` module in the infrastructure.
- Update the GitHub Actions file at `.github/workflows/deploy.yml`.