A high-performance, production-ready task management REST API built with Rust, featuring advanced optimizations, multi-cloud storage support, and comprehensive API documentation.
- Task Management: Create, read, update, and delete tasks with rich metadata
- File Attachments: Upload and attach files to tasks with multi-cloud storage support
- User Authentication: JWT-based authentication with role-based access control (User/Admin)
- Advanced Filtering: Search, filter, and paginate tasks with flexible query parameters
- Real-time Updates: WebSocket support for live notifications and updates
- Zero-Copy File Handling: Uses `bytes::Bytes` for efficient memory management
- Arena Allocation: Custom memory allocator with `bumpalo` for temporary objects
- Shared String References: `Arc<str>` for frequently accessed strings
- Redis Caching: Fast in-memory caching for frequently accessed data
- Connection Pooling: Optimized database connection management with SQLx
- Multi-Cloud Storage: Support for AWS S3, Google Cloud Storage, Azure Blob Storage, and Cloudinary
- Presigned URLs: Secure, time-limited file access without authentication
- Local Storage: Fallback option for development and testing
- Background Jobs: Asynchronous job processing for emails and cleanup tasks
- Database Migrations: Automated schema management with SQLx
- Interactive API Documentation: Swagger UI with "Try it out" functionality for all endpoints
- Comprehensive Testing: Benchmarking suite with `wrk` and `k6`
- Makefile Commands: Easy-to-use commands for common tasks
- Validation: Request validation with detailed error messages
- Logging: Structured logging with `tracing`
Based on benchmark results:
| Metric | Value |
|---|---|
| Throughput | 7,600+ requests/second |
| P50 Latency | <10ms |
| P95 Latency | <50ms |
| P99 Latency | <100ms |
| Concurrent Users | 100+ (tested) |
- Framework: Axum - Fast, ergonomic web framework
- Database: PostgreSQL with SQLx - Compile-time checked queries
- Cache: Redis - In-memory data store
- Authentication: JWT (JSON Web Tokens)
- Password Hashing: Argon2 - Secure password hashing
- Storage: AWS S3, GCP Cloud Storage, Azure Blob Storage, Cloudinary, or Local
- WebSocket: Real-time communication with Axum WebSocket support
- Documentation: utoipa - OpenAPI/Swagger generation
- Async Runtime: Tokio - Asynchronous runtime
- Rust: 1.70 or higher
- PostgreSQL: 14 or higher
- Redis: 6 or higher (optional, for caching)
- Docker: For running PostgreSQL and Redis (optional)
```bash
git clone <repository-url>
cd note-task-api
```

Create a `.env` file in the project root:
```env
# Server Configuration
SERVER_HOST=127.0.0.1
SERVER_PORT=3001

# Database Configuration
DATABASE_URL=postgresql://postgres:password@localhost:5432/note_task_db

# Redis Configuration (optional)
REDIS_URL=redis://localhost:6379

# JWT Configuration
JWT_SECRET=your-super-secret-jwt-key-change-this-in-production
JWT_EXPIRATION=86400

# Storage Configuration
# Options: local, s3, gcs, azure, cloudinary
STORAGE_PROVIDER=local
UPLOAD_DIR=./uploads

# AWS S3 (if using S3)
# AWS_REGION=us-east-1
# AWS_ACCESS_KEY_ID=your-access-key
# AWS_SECRET_ACCESS_KEY=your-secret-key
# S3_BUCKET_NAME=your-bucket-name

# GCP Cloud Storage (if using GCS)
# GCS_BUCKET_NAME=your-bucket-name
# GCS_CREDENTIALS_PATH=./path/to/credentials.json

# Azure Blob Storage (if using Azure)
# AZURE_STORAGE_ACCOUNT=your-account
# AZURE_STORAGE_ACCESS_KEY=your-key
# AZURE_CONTAINER_NAME=your-container

# Cloudinary (if using Cloudinary)
# CLOUDINARY_CLOUD_NAME=your-cloud-name
# CLOUDINARY_API_KEY=your-api-key
# CLOUDINARY_API_SECRET=your-api-secret
```

Using Docker:
```bash
# Start PostgreSQL
docker run -d \
  --name postgres \
  -e POSTGRES_PASSWORD=password \
  -e POSTGRES_DB=note_task_db \
  -p 5432:5432 \
  postgres:14

# Start Redis
docker run -d \
  --name redis \
  -p 6379:6379 \
  redis:6
```

```bash
# Install SQLx CLI
cargo install sqlx-cli --no-default-features --features postgres

# Run migrations
sqlx migrate run
```

```bash
# Development mode
cargo run

# Production mode (optimized)
cargo build --release
./target/release/note-task-api
```

The API will be available at http://localhost:3001
Open your browser and navigate to:
http://localhost:3001/swagger-ui
Visit the Swagger UI at http://localhost:3001/swagger-ui to:
- View all available endpoints
- Test endpoints directly in the browser
- See request/response schemas
- Try authentication flows
Download the OpenAPI spec at http://localhost:3001/api-docs/openapi.json for:
- Importing into Postman
- Generating client SDKs
- API gateway configuration
- Documentation generation
- `GET /health` - Health check
- `GET /ping` - Ping endpoint
- `POST /api/v1/auth/register` - Register new user
- `POST /api/v1/auth/login` - Login user
- `POST /api/v1/tasks` - Create task
- `GET /api/v1/tasks` - List tasks (with pagination & filters)
- `GET /api/v1/tasks/{id}` - Get task by ID
- `POST /api/v1/files/upload` - Upload file
- `GET /api/v1/files` - List user files
- `GET /api/v1/files/{id}` - Get file metadata
- `GET /api/v1/files/{id}/download` - Download file
- `DELETE /api/v1/files/{id}` - Delete file
- `GET /api/v1/files/stats` - Get file statistics
- `GET /api/v1/files/{id}/presigned-url` - Generate presigned URL
- `POST /api/v1/users` - Create user (admin only)
- `GET /api/v1/users/{id}` - Get user by ID
The API uses JWT (JSON Web Tokens) for authentication.
```bash
curl -X POST http://localhost:3001/api/v1/auth/register \
  -H "Content-Type: application/json" \
  -d '{
    "name": "John Doe",
    "email": "john@example.com",
    "password": "SecurePassword123!"
  }'
```

```bash
curl -X POST http://localhost:3001/api/v1/auth/login \
  -H "Content-Type: application/json" \
  -d '{
    "email": "john@example.com",
    "password": "SecurePassword123!"
  }'
```

Response:
```json
{
  "token": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9..."
}
```

Include the token in the `Authorization` header:

```bash
curl -X GET http://localhost:3001/api/v1/tasks \
  -H "Authorization: Bearer YOUR_JWT_TOKEN"
```

```bash
curl -X POST http://localhost:3001/api/v1/tasks \
  -H "Authorization: Bearer YOUR_JWT_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "title": "Buy groceries",
    "description": "Milk, eggs, bread"
  }'
```

```bash
curl -X POST http://localhost:3001/api/v1/files/upload \
  -H "Authorization: Bearer YOUR_JWT_TOKEN" \
  -F "file=@/path/to/file.pdf"
```

```bash
# 1. Upload file first
FILE_ID=$(curl -X POST http://localhost:3001/api/v1/files/upload \
  -H "Authorization: Bearer YOUR_JWT_TOKEN" \
  -F "file=@document.pdf" | jq -r '.data.id')

# 2. Create task with file attachment
curl -X POST http://localhost:3001/api/v1/tasks \
  -H "Authorization: Bearer YOUR_JWT_TOKEN" \
  -H "Content-Type: application/json" \
  -d "{
    \"title\": \"Review document\",
    \"description\": \"Check the attached PDF\",
    \"attachment_ids\": [\"$FILE_ID\"]
  }"
```
```bash
# Get tasks with pagination
curl "http://localhost:3001/api/v1/tasks?page=1&limit=10" \
  -H "Authorization: Bearer YOUR_JWT_TOKEN"

# Filter by status
curl "http://localhost:3001/api/v1/tasks?status=todo" \
  -H "Authorization: Bearer YOUR_JWT_TOKEN"

# Search tasks
curl "http://localhost:3001/api/v1/tasks?search=groceries" \
  -H "Authorization: Bearer YOUR_JWT_TOKEN"

# Combine filters
curl "http://localhost:3001/api/v1/tasks?status=in_progress&page=1&limit=20&sort_by=created_at&sort_direction=desc" \
  -H "Authorization: Bearer YOUR_JWT_TOKEN"
```

The project includes a comprehensive benchmarking suite using `wrk` and `k6`.
```bash
# Install benchmarking tools
make install-benchmark-tools

# Run all benchmarks
make benchmark

# Run specific benchmarks
make benchmark-wrk        # HTTP load testing
make benchmark-k6-load    # K6 load testing
make benchmark-k6-stress  # K6 stress testing
make benchmark-k6-ws      # WebSocket testing
```

After running benchmarks, results are saved in:

- `benchmarks/results/wrk-results.txt`
- `benchmarks/results/k6-load-results.txt`
- `benchmarks/results/k6-stress-results.txt`
- `benchmarks/results/k6-ws-results.txt`
```bash
# Development
make dev      # Run in development mode
make build    # Build the project
make release  # Build optimized release

# Database
make db-setup    # Set up database
make db-migrate  # Run migrations
make db-reset    # Reset database

# Testing
make test       # Run tests
make benchmark  # Run all benchmarks

# Utilities
make clean  # Clean build artifacts
make fmt    # Format code
make lint   # Run linter
```

```
note-task-api/
├── src/
│   ├── main.rs              # Application entry point
│   ├── lib.rs               # Library root
│   ├── config/              # Configuration management
│   │   ├── mod.rs
│   │   └── settings.rs
│   ├── domain/              # Domain models
│   │   ├── mod.rs
│   │   ├── user.rs
│   │   ├── task.rs
│   │   ├── file.rs
│   │   ├── error.rs
│   │   └── pagination.rs
│   ├── handlers/            # HTTP request handlers
│   │   ├── mod.rs
│   │   ├── auth_handlers.rs
│   │   ├── task_handlers.rs
│   │   ├── file_handlers.rs
│   │   ├── user_handlers.rs
│   │   └── health_handlers.rs
│   ├── services/            # Business logic
│   │   ├── mod.rs
│   │   ├── auth_service.rs
│   │   ├── task_service.rs
│   │   ├── file_service.rs
│   │   ├── user_service.rs
│   │   └── email_service.rs
│   ├── repositories/        # Data access layer
│   │   ├── mod.rs
│   │   ├── user_repository.rs
│   │   ├── task_repository.rs
│   │   └── file_repository.rs
│   ├── middleware/          # Custom middleware
│   │   ├── mod.rs
│   │   ├── auth.rs
│   │   └── logging.rs
│   ├── storage/             # Storage providers
│   │   ├── mod.rs
│   │   ├── local.rs
│   │   ├── s3.rs
│   │   ├── gcs.rs
│   │   ├── azure.rs
│   │   └── cloudinary.rs
│   ├── workers/             # Background jobs
│   │   ├── mod.rs
│   │   ├── worker_service.rs
│   │   └── job_processor.rs
│   ├── websocket/           # WebSocket support
│   │   ├── mod.rs
│   │   └── manager.rs
│   ├── cache/               # Redis caching
│   │   ├── mod.rs
│   │   └── redis_cache.rs
│   ├── arena/               # Arena allocation
│   │   ├── mod.rs
│   │   └── query_builder.rs
│   ├── openapi.rs           # OpenAPI/Swagger config
│   ├── routes.rs            # Route definitions
│   ├── validation.rs        # Request validation
│   └── extractors.rs        # Custom extractors
├── migrations/              # Database migrations
├── benchmarks/              # Benchmark scripts
│   ├── wrk/
│   └── k6/
├── Cargo.toml               # Rust dependencies
├── Makefile                 # Build commands
├── .env.example             # Environment variables template
└── README.md                # This file
```
Uses `bytes::Bytes` for efficient memory management:
- No unnecessary data copying
- Reference-counted buffers
- Efficient streaming
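The idea can be sketched with the standard library alone. This illustrative snippet (not the project's code, which uses the `bytes` crate) uses `Arc<[u8]>` as a stand-in to show why reference-counted buffers avoid copies:

```rust
use std::sync::Arc;

// Illustrative only: `bytes::Bytes` behaves like a reference-counted,
// sliceable byte buffer. std's Arc<[u8]> shows the core idea: cloning
// shares the allocation instead of copying the data.
fn share_payload(data: Vec<u8>) -> (Arc<[u8]>, Arc<[u8]>) {
    let shared: Arc<[u8]> = Arc::from(data); // one allocation
    let handle = Arc::clone(&shared);        // O(1): bumps a refcount, no memcpy
    (shared, handle)
}

fn main() {
    let (a, b) = share_payload(vec![1, 2, 3]);
    // Both handles point at the same underlying buffer.
    assert_eq!(a.as_ptr(), b.as_ptr());
    assert_eq!(Arc::strong_count(&a), 2);
}
```

`bytes::Bytes` adds cheap sub-slicing on top of this, which is what makes streaming request bodies efficient.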
Custom memory allocator with `bumpalo`:
- O(1) allocation
- Bulk deallocation
- Reduced memory fragmentation
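A minimal std-only sketch of the arena pattern (not the project's `bumpalo`-backed allocator; `Arena` and its methods are illustrative names) shows the two properties listed above:

```rust
// A tiny typed arena: allocation is an O(1) push, and dropping the arena
// frees every object at once (the bulk-deallocation property bumpalo gives).
struct Arena<T> {
    items: Vec<T>,
}

impl<T> Arena<T> {
    fn new() -> Self {
        Arena { items: Vec::new() }
    }

    // Store a value and hand back its index; with bumpalo you would get a
    // reference tied to the arena's lifetime instead.
    fn alloc(&mut self, value: T) -> usize {
        self.items.push(value);
        self.items.len() - 1
    }

    fn get(&self, idx: usize) -> &T {
        &self.items[idx]
    }

    fn len(&self) -> usize {
        self.items.len()
    }
}

fn main() {
    let mut arena: Arena<String> = Arena::new();
    let a = arena.alloc("temporary query fragment".to_string());
    let b = arena.alloc("another fragment".to_string());
    assert_eq!(arena.get(a), "temporary query fragment");
    assert_eq!(arena.get(b), "another fragment");
    assert_eq!(arena.len(), 2);
    // `arena` drops here: all items are freed together.
}
```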
Optimized database connections:
- Connection reuse
- Configurable pool size
- Health checks
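The reuse idea behind pooling can be sketched without a database. This std-only illustration (the project itself relies on SQLx's built-in pool; `Pool`, `Conn`, `checkout`, and `put_back` are hypothetical names) shows connections being borrowed and returned instead of reopened:

```rust
use std::sync::{Arc, Mutex};

// Stand-in for a database connection.
#[derive(Debug)]
struct Conn {
    id: u32,
}

// A fixed-size pool: idle connections sit on a shared stack.
struct Pool {
    idle: Mutex<Vec<Conn>>,
}

impl Pool {
    fn with_size(size: u32) -> Arc<Self> {
        let idle = (0..size).map(|id| Conn { id }).collect();
        Arc::new(Pool { idle: Mutex::new(idle) })
    }

    // Borrow a connection; None means the pool is exhausted.
    fn checkout(&self) -> Option<Conn> {
        self.idle.lock().unwrap().pop()
    }

    // Return a connection for reuse instead of closing it.
    fn put_back(&self, conn: Conn) {
        self.idle.lock().unwrap().push(conn);
    }
}

fn main() {
    let pool = Pool::with_size(2);
    let conn = pool.checkout().expect("a connection is available");
    pool.put_back(conn); // reused on the next checkout, no reconnect cost
    assert_eq!(pool.idle.lock().unwrap().len(), 2);
}
```

SQLx layers health checks and configurable limits on top of this basic checkout/return cycle.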
Fast in-memory caching:
- Frequently accessed data
- Reduced database load
- Configurable TTL
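The TTL behavior can be illustrated in-process. This sketch is not the project's Redis layer (`TtlCache` is a hypothetical name); it only shows how entries expire after a configurable duration:

```rust
use std::collections::HashMap;
use std::time::{Duration, Instant};

// Each entry remembers its expiry; lookups ignore stale values.
struct TtlCache {
    ttl: Duration,
    entries: HashMap<String, (String, Instant)>,
}

impl TtlCache {
    fn new(ttl: Duration) -> Self {
        TtlCache { ttl, entries: HashMap::new() }
    }

    fn put(&mut self, key: &str, value: &str) {
        let expires = Instant::now() + self.ttl;
        self.entries.insert(key.to_string(), (value.to_string(), expires));
    }

    fn get(&self, key: &str) -> Option<&str> {
        self.entries.get(key).and_then(|(value, expires)| {
            if Instant::now() < *expires {
                Some(value.as_str())
            } else {
                None // expired: caller falls back to the database
            }
        })
    }
}

fn main() {
    let mut cache = TtlCache::new(Duration::from_secs(60));
    cache.put("task:1", "Buy groceries");
    assert_eq!(cache.get("task:1"), Some("Buy groceries"));
    assert_eq!(cache.get("task:2"), None);
}
```

Redis provides the same semantics across processes via `EXPIRE`/`SET ... EX`, which is what makes it suitable as a shared cache in front of PostgreSQL.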
Uses `Arc<str>` for frequently accessed strings:
- Reduced memory usage
- Efficient cloning
- Thread-safe sharing
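A std-only sketch of the pattern (the `share` helper is illustrative): one heap allocation holds the string data, and every clone is an O(1) refcount bump rather than a string copy:

```rust
use std::sync::Arc;

// Returns the original handle and a clone; both point at the same data.
fn share(s: &str) -> (Arc<str>, Arc<str>) {
    let original: Arc<str> = Arc::from(s);
    let clone = Arc::clone(&original); // no allocation, no memcpy
    (original, clone)
}

fn main() {
    let (a, b) = share("admin");
    assert_eq!(&*a, "admin");
    assert_eq!(&*b, "admin");
    assert_eq!(Arc::strong_count(&a), 2);
}
```

Compared to `String`, `Arc<str>` also drops the capacity field, so it is smaller as well as cheaper to clone.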
The API supports multiple storage providers. Switch between them by changing the `STORAGE_PROVIDER` environment variable.
```env
STORAGE_PROVIDER=local
UPLOAD_DIR=./uploads
```

```env
STORAGE_PROVIDER=s3
AWS_REGION=us-east-1
AWS_ACCESS_KEY_ID=your-access-key
AWS_SECRET_ACCESS_KEY=your-secret-key
S3_BUCKET_NAME=your-bucket-name
```

```env
STORAGE_PROVIDER=gcs
GCS_BUCKET_NAME=your-bucket-name
GCS_CREDENTIALS_PATH=./credentials.json
```

```env
STORAGE_PROVIDER=azure
AZURE_STORAGE_ACCOUNT=your-account
AZURE_STORAGE_ACCESS_KEY=your-key
AZURE_CONTAINER_NAME=your-container
```

```env
STORAGE_PROVIDER=cloudinary
CLOUDINARY_CLOUD_NAME=your-cloud-name
CLOUDINARY_API_KEY=your-api-key
CLOUDINARY_API_SECRET=your-api-secret
```

- Password Hashing: Argon2 algorithm with salt
- JWT Authentication: Secure token-based auth
- Role-Based Access Control: User and Admin roles
- Input Validation: Comprehensive request validation
- SQL Injection Prevention: Compile-time checked queries with SQLx
- CORS: Configurable cross-origin resource sharing
- Rate Limiting: (TODO) Prevent abuse
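The role-based check can be sketched in a few lines. This is a hypothetical illustration, not the project's actual types (`Role` and `can_manage_users` are invented names):

```rust
// Roles mirror the User/Admin split described above.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum Role {
    User,
    Admin,
}

// Admin-only endpoints (e.g. creating users) gate on the caller's role,
// which the auth middleware extracts from the verified JWT claims.
fn can_manage_users(role: Role) -> bool {
    role == Role::Admin
}

fn main() {
    assert!(can_manage_users(Role::Admin));
    assert!(!can_manage_users(Role::User));
}
```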
```bash
# Check if PostgreSQL is running
docker ps | grep postgres

# Check connection
psql -h localhost -U postgres -d note_task_db
```

```bash
# Check if Redis is running
docker ps | grep redis

# Test connection
redis-cli ping
```

```bash
# Find process using port 3001
lsof -i :3001

# Kill the process
kill -9 <PID>
```

Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (`git checkout -b feature/AmazingFeature`)
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
Built with ❤️ using Rust
- Rust community for excellent crates and documentation
- Axum team for the amazing web framework
- SQLx team for compile-time checked queries
- utoipa team for OpenAPI generation
Happy Coding!