diff --git a/README.md b/README.md
index a97ab47..7cedc91 100644
--- a/README.md
+++ b/README.md
@@ -1,5 +1,14 @@
-# FastAPI-boilerplate
->A template to speed your FastAPI development up.
+# Fast FastAPI boilerplate
+
+> Yet another template to speed your FastAPI development up.
+
## 0. About
**FastAPI boilerplate** creates an extendable async API using FastAPI, Pydantic V2, SQLAlchemy 2.0 and PostgreSQL:
@@ -7,7 +16,8 @@
- [`Pydantic V2`](https://docs.pydantic.dev/2.4/): the most widely used data validation library for Python, now rewritten in Rust [`(5x to 50x speed improvement)`](https://docs.pydantic.dev/latest/blog/pydantic-v2-alpha/)
- [`SQLAlchemy 2.0`](https://docs.sqlalchemy.org/en/20/changelog/whatsnew_20.html): Python SQL toolkit and Object Relational Mapper
- [`PostgreSQL`](https://www.postgresql.org): The World's Most Advanced Open Source Relational Database
-- [`Redis`](https://redis.io): The open source, in-memory data store used by millions of developers as a database, cache, streaming engine, and message broker.
+- [`Redis`](https://redis.io): The open source, in-memory data store used by millions of developers as a database, cache, streaming engine, and message broker
+- [`ARQ`](https://arq-docs.helpmanual.io): Job queues and RPC in Python with asyncio and Redis
## 1. Features
- Fully async
@@ -15,18 +25,13 @@
- User authentication with JWT
- Easy redis caching
- Easy client-side caching
+- ARQ integration for task queues
- Easily extendable
- Flexible
-### 1.1 To do
-- [x] Redis cache
-- [ ] Arq job queues
-- [x] App settings (such as database connection, etc) only for what's inherited in core.config.Settings
-
## 2. Contents
0. [About](#0-about)
1. [Features](#1-features)
- 1. [To do](#11-to-do)
2. [Contents](#2-contents)
3. [Usage](#3-usage)
4. [Requirements](#4-requirements)
@@ -39,15 +44,17 @@
7. [Creating the first superuser](#7-creating-the-first-superuser)
8. [Database Migrations](#8-database-migrations)
9. [Extending](#9-extending)
- 1. [Database Model](#91-database-model)
- 2. [SQLAlchemy Models](#92-sqlalchemy-model)
- 3. [Pydantic Schemas](#93-pydantic-schemas)
- 4. [Alembic Migrations](#94-alembic-migration)
- 5. [CRUD](#95-crud)
- 6. [Routes](#96-routes)
- 7. [Caching](#97-caching)
- 8. [More Advanced Caching](#98-more-advanced-caching)
- 9. [Running](#99-running)
+    1. [Project Structure](#91-project-structure)
+    2. [Database Model](#92-database-model)
+    3. [SQLAlchemy Model](#93-sqlalchemy-model)
+    4. [Pydantic Schemas](#94-pydantic-schemas)
+    5. [Alembic Migration](#95-alembic-migration)
+    6. [CRUD](#96-crud)
+    7. [Routes](#97-routes)
+    8. [Caching](#98-caching)
+    9. [More Advanced Caching](#99-more-advanced-caching)
+    10. [ARQ Job Queues](#910-arq-job-queues)
+    11. [Running](#911-running)
10. [Testing](#10-testing)
11. [Contributing](#11-contributing)
12. [References](#12-references)
@@ -68,7 +75,7 @@ Then install poetry:
pip install poetry
```
-In the **src** directory, run to install required packages:
+In the `src` directory, run the following to install the required packages:
```sh
poetry install
```
@@ -131,10 +138,18 @@ REDIS_CACHE_PORT=6379
And for client-side caching:
```
-# ------------- redis -------------
-REDIS_CACHE_HOST="your_host" # default localhost
-REDIS_CACHE_PORT=6379
+# ------------- client-side cache -------------
+CLIENT_CACHE_MAX_AGE=60 # default 60
```
+
+And for ARQ job queues:
+```
+# ------------- redis queue -------------
+REDIS_QUEUE_HOST="your_host" # default localhost
+REDIS_QUEUE_PORT=6379
+```
+
___
## 5. Running Databases With Docker:
### 5.1 PostgreSQL (main database)
@@ -165,7 +180,7 @@ docker run -d \
[`If you didn't create the .env variables yet, click here.`](#environment-variables)
-### 5.2 Redis (for caching)
+### 5.2 Redis (for caching and job queue)
Install docker if you don't have it yet, then run:
```sh
docker pull redis:alpine
@@ -218,11 +233,80 @@ poetry run alembic upgrade head
___
## 9. Extending
-### 9.1 Database Model
+### 9.1 Project Structure
+```sh
+.
+├── .env                          # Environment variables file for configuration and secrets.
+├── __init__.py                   # An initialization file for the package.
+├── alembic.ini                   # Configuration file for Alembic (database migration tool).
+├── app                           # Main application directory.
+│   ├── __init__.py               # Initialization file for the app package.
+│   ├── api                       # Folder containing API-related logic.
+│   │   ├── __init__.py           # Initialization file for the api package.
+│   │   ├── dependencies.py       # Defines dependencies that can be reused across the API endpoints.
+│   │   ├── exceptions.py         # Contains custom exceptions for the API.
+│   │   └── v1                    # Version 1 of the API.
+│   │       ├── __init__.py       # Initialization file for the v1 package.
+│   │       ├── login.py          # API routes related to user login.
+│   │       ├── posts.py          # API routes related to posts.
+│   │       ├── tasks.py          # API routes related to background tasks.
+│   │       └── users.py          # API routes related to user management.
+│   │
+│   ├── core                      # Core utilities and configurations for the application.
+│   │   ├── __init__.py           # Initialization file for the core package.
+│   │   ├── cache.py              # Utilities related to caching.
+│   │   ├── config.py             # Application configuration settings.
+│   │   ├── database.py           # Database connectivity and session management.
+│   │   ├── exceptions.py         # Contains core custom exceptions for the application.
+│   │   ├── models.py             # Base models for the application.
+│   │   ├── queue.py              # Utilities related to task queues.
+│   │   └── security.py           # Security utilities like password hashing and token generation.
+│   │
+│   ├── crud                      # CRUD operations for the application.
+│   │   ├── __init__.py           # Initialization file for the crud package.
+│   │   ├── crud_base.py          # Base CRUD operations class that can be extended by other CRUD modules.
+│   │   ├── crud_posts.py         # CRUD operations for posts.
+│   │   └── crud_users.py         # CRUD operations for users.
+│   │
+│   ├── main.py                   # Entry point for the FastAPI application.
+│   │
+│   ├── models                    # ORM models for the application.
+│   │   ├── __init__.py           # Initialization file for the models package.
+│   │   ├── post.py               # ORM model for posts.
+│   │   └── user.py               # ORM model for users.
+│   │
+│   ├── schemas                   # Pydantic schemas for data validation.
+│   │   ├── __init__.py           # Initialization file for the schemas package.
+│   │   ├── job.py                # Schemas related to background jobs.
+│   │   ├── post.py               # Schemas related to posts.
+│   │   └── user.py               # Schemas related to users.
+│   │
+│   └── worker.py                 # Worker script for handling background tasks.
+│
+├── migrations                    # Directory for Alembic migrations.
+│   ├── README                    # General info and guidelines for migrations.
+│   ├── env.py                    # Environment configurations for Alembic.
+│   ├── script.py.mako            # Template script for migration generation.
+│   └── versions                  # Folder containing individual migration scripts.
+│       └── README.MD             # Readme for the versions directory.
+│
+├── poetry.lock                   # Lock file for Poetry, ensuring consistent dependencies.
+├── pyproject.toml                # Configuration file for Poetry, lists project dependencies.
+├── scripts                       # Utility scripts for the project.
+│   └── create_first_superuser.py # Script to create the first superuser in the application.
+│
+└── tests                         # Directory containing all the tests.
+    ├── __init__.py               # Initialization file for the tests package.
+    ├── conftest.py               # Configuration and fixtures for pytest.
+    ├── helper.py                 # Helper functions for writing tests.
+    └── test_user.py              # Tests related to the user model and endpoints.
+```
+
+### 9.2 Database Model
Create the new entities and relationships and add them to the model

-### 9.2 SQLAlchemy Model
+### 9.3 SQLAlchemy Model
Inside `app/models`, create a new `entity.py` for each new entity (replacing entity with the name) and define the attributes according to [SQLAlchemy 2.0 standards](https://docs.sqlalchemy.org/en/20/orm/mapping_styles.html#orm-mapping-styles):
```python
from sqlalchemy import String, DateTime
@@ -240,7 +324,7 @@ class Entity(Base):
...
```
-### 9.3 Pydantic Schemas
+### 9.4 Pydantic Schemas
Inside `app/schemas`, create a new `entity.py` for each new entity (replacing `entity` with the name) and create the schemas according to [Pydantic V2](https://docs.pydantic.dev/latest/#pydantic-examples) standards:
```python
from typing import Annotated
@@ -280,7 +364,7 @@ class EntityDelete(BaseModel):
```
-### 9.4 Alembic Migration
+### 9.5 Alembic Migration
Then, while in the `src` folder, run Alembic migrations:
```sh
poetry run alembic revision --autogenerate
@@ -291,7 +375,7 @@ And to apply the migration
poetry run alembic upgrade head
```
-### 9.5 CRUD
+### 9.6 CRUD
Inside `app/crud`, create a new `crud_entities.py` inheriting from `CRUDBase` for each new entity:
```python
from app.crud.crud_base import CRUDBase
@@ -302,7 +386,7 @@ CRUDEntity = CRUDBase[Entity, EntityCreateInternal, EntityUpdate, EntityUpdateIn
crud_entity = CRUDEntity(Entity)
```
-### 9.6 Routes
+### 9.7 Routes
Inside `app/api/v1`, create a new `entities.py` file and add the desired routes:
```python
from typing import Annotated
@@ -333,7 +417,7 @@ router = APIRouter(prefix="/v1") # this should be there already
router.include_router(entity_router)
```
-### 9.7 Caching
+### 9.8 Caching
The `cache` decorator allows you to cache the results of FastAPI endpoint functions, enhancing response times and reducing the load on your application by storing and retrieving data in a cache.
Caching the response of an endpoint is really simple: just apply the `cache` decorator to the endpoint function.
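+
+For illustration, a minimal sketch (this endpoint is hypothetical; the `cache` decorator lives in `app/core/cache.py`, and `key_prefix` and `resource_id_name` are the parameters discussed below):
+```python
+from fastapi import Request
+
+from app.core.cache import cache
+
+
+@router.get("/entity/{id}")
+@cache(key_prefix="entity_data", resource_id_name="id")
+async def get_entity(request: Request, id: int):
+    ...
+```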
@@ -381,7 +465,7 @@ In this case, what will happen is:
Passing `resource_id_name` is usually preferred.
-### 9.8 More Advanced Caching
+### 9.9 More Advanced Caching
The behaviour of the `cache` decorator changes based on the request method of your endpoint.
It caches the result when applied to a **GET** endpoint, and it invalidates the cache with this `key_prefix` and `id` when applied to other endpoints (**PATCH**, **DELETE**).
@@ -437,11 +521,53 @@ async def patch_post(
```
> **Warning**
-> Note that this will not work for **GET** requests.
+> Note that adding `to_invalidate_extra` will not work for **GET** requests.
+#### Client-side Caching
For `client-side caching`, all you have to do is let the `Settings` class defined in `app/core/config.py` inherit from the `ClientSideCacheSettings` class. You can set the `CLIENT_CACHE_MAX_AGE` value in `.env`; it defaults to 60 (seconds).
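+
+A minimal sketch of that composition (the class names mirror `app/core/config.py` further down in this diff; the other settings mixins are omitted here for brevity):
+```python
+class Settings(
+    AppSettings,
+    ClientSideCacheSettings,  # adds CLIENT_CACHE_MAX_AGE (default: 60 seconds)
+):
+    pass
+```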
-### 9.9 Running
+### 9.10 ARQ Job Queues
+Create the background task in `app/worker.py`:
+```python
+...
+# -------- background tasks --------
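+# ctx is the context dict that arq passes to every task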
+async def sample_background_task(ctx, name: str) -> str:
+    await asyncio.sleep(5)
+    return f"Task {name} is complete!"
+```
+
+Then add the function to the `WorkerSettings` class `functions` variable:
+```python
+# -------- class --------
+...
+class WorkerSettings:
+    functions = [sample_background_task]
+    ...
+```
+
+Add the task to be enqueued in a **POST** endpoint and get the info in a **GET**:
+```python
+...
+@router.post("/task", response_model=Job, status_code=201)
+async def create_task(message: str):
+    job = await queue.pool.enqueue_job("sample_background_task", message)
+    return {"id": job.job_id}
+
+
+@router.get("/task/{task_id}")
+async def get_task(task_id: str):
+    job = ArqJob(task_id, queue.pool)
+    return await job.info()
+```
+
+And finally, run the worker in parallel with your FastAPI application.
+While in the `src` folder:
+```sh
+poetry run arq app.worker.WorkerSettings
+```
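+
+To try it out (a sketch — this assumes the app is served at `localhost:8000` and the `/v1` router from this diff is mounted at the application root; adjust the URL to your deployment):
+```sh
+# enqueue a task; the response carries the job id
+curl -X POST "http://localhost:8000/v1/tasks/task?message=hello"
+
+# then poll the task info with the id returned above
+curl "http://localhost:8000/v1/tasks/task/<job_id>"
+```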
+
+### 9.11 Running
While in the `src` folder, run the following to start the application with the Uvicorn server:
```sh
poetry run uvicorn app.main:app --reload
@@ -481,7 +607,7 @@ Contributions are appreciated, even if just reporting bugs, documenting stuff or
This project was inspired by a few others; it's based on them, with things changed to the way I like (and Pydantic and SQLAlchemy updated)
* [`Full Stack FastAPI and PostgreSQL`](https://github.com/tiangolo/full-stack-fastapi-postgresql) by @tiangolo himself
* [`FastAPI Microservices`](https://github.com/Kludex/fastapi-microservices) by @kludex which heavily inspired this boilerplate
-* [`Async Web API with FastAPI + SQLAlchemy 2.0`](https://github.com/rhoboro/async-fastapi-sqlalchemy)
+* [`Async Web API with FastAPI + SQLAlchemy 2.0`](https://github.com/rhoboro/async-fastapi-sqlalchemy) for sqlalchemy 2.0 ORM examples
## 13. License
[`MIT`](LICENSE.md)
diff --git a/src/app/api/v1/__init__.py b/src/app/api/v1/__init__.py
index 663ef59..5a52cf4 100644
--- a/src/app/api/v1/__init__.py
+++ b/src/app/api/v1/__init__.py
@@ -3,8 +3,10 @@
from app.api.v1.login import router as login_router
from app.api.v1.users import router as users_router
from app.api.v1.posts import router as posts_router
+from app.api.v1.tasks import router as tasks_router
router = APIRouter(prefix="/v1")
router.include_router(login_router)
router.include_router(users_router)
router.include_router(posts_router)
+router.include_router(tasks_router)
diff --git a/src/app/api/v1/tasks.py b/src/app/api/v1/tasks.py
new file mode 100644
index 0000000..e05395a
--- /dev/null
+++ b/src/app/api/v1/tasks.py
@@ -0,0 +1,19 @@
+from arq.jobs import Job as ArqJob
+from fastapi import APIRouter
+
+from app.core import queue
+from app.schemas.job import Job
+
+router = APIRouter(prefix="/tasks", tags=["Tasks"])
+
+
+@router.post("/task", response_model=Job, status_code=201)
+async def create_task(message: str):
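+    # enqueue the task in the ARQ Redis queue and return the job id so the client can poll it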
+    job = await queue.pool.enqueue_job("sample_background_task", message)
+    return {"id": job.job_id}
+
+
+@router.get("/task/{task_id}")
+async def get_task(task_id: str):
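+    # build a job handle from the id and return its current info (status, result, etc.)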
+    job = ArqJob(task_id, queue.pool)
+    return await job.info()
diff --git a/src/app/api/v1/users.py b/src/app/api/v1/users.py
index b9ef808..b02c71f 100644
--- a/src/app/api/v1/users.py
+++ b/src/app/api/v1/users.py
@@ -5,7 +5,7 @@
from fastapi import Request
import fastapi
-from app.schemas.user import UserCreate, UserCreateInternal, UserUpdate, UserRead, UserBase
+from app.schemas.user import UserCreate, UserCreateInternal, UserUpdate, UserRead
from app.api.dependencies import get_current_user, get_current_superuser
from app.core.database import async_get_db
from app.core.security import get_password_hash
diff --git a/src/app/core/config.py b/src/app/core/config.py
index 7256ad4..c485291 100644
--- a/src/app/core/config.py
+++ b/src/app/core/config.py
@@ -76,6 +76,11 @@ class ClientSideCacheSettings(BaseSettings):
CLIENT_CACHE_MAX_AGE: int = config("CLIENT_CACHE_MAX_AGE", default=60)
+class RedisQueueSettings(BaseSettings):
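+    # Redis connection settings for the ARQ job queue, read from .env (REDIS_QUEUE_HOST / REDIS_QUEUE_PORT)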
+    REDIS_QUEUE_HOST: str = config("REDIS_QUEUE_HOST", default="localhost")
+    REDIS_QUEUE_PORT: int = config("REDIS_QUEUE_PORT", default=6379)
+
+
class Settings(
AppSettings,
PostgresSettings,
@@ -83,7 +88,8 @@ class Settings(
FirstUserSettings,
TestSettings,
RedisCacheSettings,
- ClientSideCacheSettings
+    ClientSideCacheSettings,
+    RedisQueueSettings
):
pass
diff --git a/src/app/core/queue.py b/src/app/core/queue.py
new file mode 100644
index 0000000..7084037
--- /dev/null
+++ b/src/app/core/queue.py
@@ -0,0 +1,3 @@
+from arq.connections import ArqRedis
+
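+# Global ARQ Redis pool: created on application startup in app/main.py and shared wherever jobs are enqueued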
+pool: ArqRedis | None = None
diff --git a/src/app/main.py b/src/app/main.py
index b788a6a..10c3371 100644
--- a/src/app/main.py
+++ b/src/app/main.py
@@ -1,11 +1,20 @@
from fastapi import FastAPI
import redis.asyncio as redis
+from arq import create_pool
+from arq.connections import RedisSettings
+from app.api import router
+from app.core import cache, queue
from app.core.database import Base
from app.core.database import async_engine as engine
-from app.core.config import settings, DatabaseSettings, RedisCacheSettings, AppSettings, ClientSideCacheSettings
-from app.api import router
-from app.core import cache
+from app.core.config import (
+    settings,
+    DatabaseSettings,
+    RedisCacheSettings,
+    AppSettings,
+    ClientSideCacheSettings,
+    RedisQueueSettings
+)
# -------------- database --------------
async def create_tables():
@@ -23,6 +32,17 @@ async def close_redis_cache_pool():
await cache.client.close()
+# -------------- queue --------------
+async def create_redis_queue_pool():
+    queue.pool = await create_pool(
+        RedisSettings(host=settings.REDIS_QUEUE_HOST, port=settings.REDIS_QUEUE_PORT)
+    )
+
+
+async def close_redis_queue_pool():
+    await queue.pool.close()
+
+
# -------------- application --------------
def create_application() -> FastAPI:
if isinstance(settings, AppSettings):
@@ -47,6 +67,10 @@ def create_application() -> FastAPI:
if isinstance(settings, ClientSideCacheSettings):
application.add_middleware(cache.ClientCacheMiddleware, max_age=60)
+ if isinstance(settings, RedisQueueSettings):
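+        # create the ARQ Redis pool on startup and close it on shutdown, mirroring the cache pool above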
+        application.add_event_handler("startup", create_redis_queue_pool)
+        application.add_event_handler("shutdown", close_redis_queue_pool)
+
return application
diff --git a/src/app/schemas/job.py b/src/app/schemas/job.py
new file mode 100644
index 0000000..d3a3600
--- /dev/null
+++ b/src/app/schemas/job.py
@@ -0,0 +1,4 @@
+from pydantic import BaseModel
+
+
+class Job(BaseModel):
+    id: str
diff --git a/src/app/worker.py b/src/app/worker.py
new file mode 100644
index 0000000..b4c8ad4
--- /dev/null
+++ b/src/app/worker.py
@@ -0,0 +1,37 @@
+import asyncio
+import uvloop
+from arq.connections import RedisSettings
+
+from app.core.config import settings
+
+asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
+
+REDIS_QUEUE_HOST = settings.REDIS_QUEUE_HOST
+REDIS_QUEUE_PORT = settings.REDIS_QUEUE_PORT
+
+
+# -------- background tasks --------
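+# ctx is the context dict arq passes into every task (it carries the redis connection, job id, retry count, etc.)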
+async def sample_background_task(ctx, name: str) -> str:
+    await asyncio.sleep(5)
+    return f"Task {name} is complete!"
+
+
+# -------- base functions --------
+async def startup(ctx):
+    print("worker start")
+
+
+async def shutdown(ctx):
+    print("worker end")
+
+
+# -------- class --------
+class WorkerSettings:
+    functions = [sample_background_task]
+    redis_settings = RedisSettings(
+        host=REDIS_QUEUE_HOST,
+        port=REDIS_QUEUE_PORT
+    )
+    on_startup = startup
+    on_shutdown = shutdown
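+    # don't let arq install its own signal handlers; the surrounding process manages shutdown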
+    handle_signals = False
diff --git a/src/pyproject.toml b/src/pyproject.toml
index 247c8d0..f9fedf4 100644
--- a/src/pyproject.toml
+++ b/src/pyproject.toml
@@ -28,8 +28,8 @@ python-decouple = "^3.8"
greenlet = "^2.0.2"
httpx = "^0.25.0"
pydantic-settings = "^2.0.3"
-arq = "^0.25.0"
redis = "^5.0.1"
+arq = "^0.25.0"
[build-system]