Welcome to the repository for the Aleš Lončar Blog. This repository contains the source code and content for my personal/professional blog. Here you'll find instructions for setting up your development environment, cloning the repository, and building your own blog.
To clone this repository and start contributing, follow these steps:
git clone https://github.com/loncarales/loncar.net.git
cd loncar.net
This project uses Dev Containers to ensure a consistent development environment, eliminating the need for local installations.
Ensure you have the following installed:
- Docker
- Visual Studio Code
- The Dev Containers extension for VS Code
Once you have set up Docker, VS Code, and the Dev Containers extension, you can start working within a containerized development environment that mirrors the project's setup.
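For illustration, a minimal `.devcontainer/devcontainer.json` for a Hugo setup like this could look roughly as follows; the base image, feature, and forwarded port are assumptions for the sketch, not necessarily what this repository actually uses:

```jsonc
{
  "name": "Hugo Blog",
  // Assumed base image; the real configuration may use a different one
  "image": "mcr.microsoft.com/devcontainers/base:ubuntu",
  "features": {
    // Dev Container feature that installs Hugo (extended edition)
    "ghcr.io/devcontainers/features/hugo:1": { "extended": true }
  },
  // Expose the default Hugo dev server port
  "forwardPorts": [1313]
}
```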
Within the dev container, there are predefined tasks in `.vscode/tasks.json` to facilitate development:
- Test Task: Launches a local Hugo server with draft content included. This is useful for previewing changes in real time as you develop. To run this task, execute:
hugo server -D
This command serves your site locally at http://localhost:1313 (default Hugo server address) and includes draft posts (-D flag).
- Build Task: Builds the static site with minification and sets the log level to info. This task mirrors the build process used in deployment. To run this task, execute:
hugo --minify --logLevel info
This command prepares your site for production by minifying resources and providing informational logs during the build.
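As a rough sketch of how these two tasks could be defined (the actual labels and options in this repository's `.vscode/tasks.json` may differ):

```jsonc
{
  "version": "2.0.0",
  "tasks": [
    {
      // Serves the site with drafts included for local preview
      "label": "Test",
      "type": "shell",
      "command": "hugo server -D",
      "isBackground": true,
      "group": "test"
    },
    {
      // Production-style build with minification
      "label": "Build",
      "type": "shell",
      "command": "hugo --minify --logLevel info",
      "group": "build"
    }
  ]
}
```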
You can execute these tasks manually via the terminal within the dev container or use VS Code's integrated task runner:
- Open the Command Palette (Ctrl+Shift+P on Windows/Linux, Cmd+Shift+P on macOS).
- Type `Tasks: Run`.
- Choose either `Run Test Task` or `Run Build Task` from the list.
This setup ensures that you can easily test and build your blog within a consistent environment, closely mimicking the CI/CD process and reducing the chances of environment-specific bugs.
- Hugo: I use Hugo, a fast and modern static site generator, to build the blog. It turns Markdown files into an entire static website.
- AWS S3: The static pages generated by Hugo are hosted on an AWS S3 bucket configured for website hosting (see the sketch after this list).
- CloudFlare: CloudFlare is used for DNS resolution, pointing the domain to the correct AWS S3 bucket. It also manages SSL/TLS certificates, providing secure access to the blog.
- bunny.net: I use bunny.net as a CDN to deliver media content quickly and efficiently worldwide. The CDN is configured with a custom hostname to align with branding.
- AWS S3 Bucket for Media: Separate from the static page hosting, I use another S3 bucket dedicated to storing and serving media content (images, videos) in the blog posts.
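As an illustration of the hosting piece, static website hosting on the bucket can be enabled with the AWS CLI; the index and error document names below are assumptions, not necessarily what the bucket actually uses:

```shell
# Enable static website hosting on the hosting bucket (document names assumed)
aws s3 website s3://loncar.net/ --index-document index.html --error-document 404.html
```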
Deployment is automated using GitHub Actions, which builds the blog with Hugo and synchronizes the generated static files to the AWS S3 hosting bucket.
The workflow consists of the following steps:
- Checkout: Check out the repository along with its submodules (the Hugo theme).
- Build: Set up and run Hugo to build the static site from the source.
- Deploy: Sync the built site to the AWS S3 bucket using the Python S3cmd CLI tool. The command used is:
s3cmd sync --acl-public --recursive --delete-removed public/ s3://loncar.net
This command makes the uploaded files public (--acl-public), recursively uploads all files within the public/ directory, and removes remote files that no longer exist locally (--delete-removed).
The workflow is triggered on pushes to the main branch or pull request events, ensuring the live site is always up-to-date with the latest changes.
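Put together, a workflow along these lines might look roughly like the sketch below. The action versions, Hugo setup step, credential handling, and secret names are assumptions for illustration; the authoritative definition lives in `.github/workflows/publish.yml`.

```yaml
name: Publish

on:
  push:
    branches: [main]
  pull_request:

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      # Checkout: repository plus the Hugo theme submodule
      - uses: actions/checkout@v4
        with:
          submodules: true

      # Build: install Hugo and generate the minified site into public/
      - uses: peaceiris/actions-hugo@v3
        with:
          hugo-version: 'latest'
          extended: true
      - run: hugo --minify --logLevel info

      # Deploy: sync the generated site to the S3 hosting bucket
      - run: |
          pip install s3cmd
          s3cmd sync --acl-public --recursive --delete-removed public/ s3://loncar.net
        env:
          # Secret names and credential mechanism are assumed for this sketch
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
```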
I created a dedicated AWS IAM user for this deployment process, following zero-trust principles. This user is assigned only the permissions necessary to perform the sync operation, adhering to the principle of least privilege.
The policy attached to this user is as follows:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListAllMyBuckets"
            ],
            "Resource": "arn:aws:s3:::*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:ListBucket",
                "s3:DeleteObject",
                "s3:PutObjectAcl"
            ],
            "Resource": [
                "arn:aws:s3:::loncar.net",
                "arn:aws:s3:::loncar.net/*"
            ]
        }
    ]
}
This policy ensures the user can list all buckets (needed by the S3cmd tool to find the correct one for syncing) and manage objects within the loncar.net bucket, including setting ACLs to make objects public.
To test the GitHub Actions workflow locally, I use act. After installing act, you can simulate the workflow by running the following command in the repository root:
act -W .github/workflows/publish.yml --secret-file .secrets
You need to provide the required secrets in the .secrets file.
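For example, assuming the workflow reads standard AWS credential secrets (the exact names depend on what the workflow expects), the .secrets file would look something like this, with placeholder values:

```shell
# .secrets (dotenv-style file consumed by act; values are placeholders)
AWS_ACCESS_KEY_ID=AKIAXXXXXXXXXXXXXXXX
AWS_SECRET_ACCESS_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
```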
This allows me to catch any potential issues in the CI/CD process before pushing changes.
This project is licensed under the MIT License.