A minimal, production-minded AI platform demonstrating an internal Python SDK, observability, evaluation, containerisation, and deployment scaffolding.
This repository demonstrates a reusable AI platform architecture with two core components:
A modular, reusable Python library that serves as the foundation for AI capabilities across the organization. It provides:
- A unified `AIClient` interface
- Text summarisation capability (`simple` and HuggingFace backends)
- Built-in observability via Langfuse
- Clean configuration patterns
- Reusable utilities for downstream services
In production: This would be a separate repository published as a Python package (PyPI, Artifactory, etc.) that multiple services consume.
A lightweight microservice that demonstrates how services consume the SDK:
- `GET /health` – Health check
- `POST /summarise` – Text summarisation endpoint
This structure demonstrates Shared SDK → Multiple Services, Testing + LLM Evaluation, and GitOps-ready packaging.
```
ai-sdk/
│
├── ai_lib/           ← Internal SDK (core library)
│   ├── client.py
│   ├── summarisation/
│   ├── tracing/
│   ├── tests/
│   └── ...
│
├── service/          ← FastAPI microservice using lib
│   ├── app/          ← Service code
│   ├── k8s/          ← Kubernetes manifests
│   └── DockerFile    ← Container definition
├── terraform/        ← Infrastructure as Code
├── eval/             ← DeepEval test suite
├── pyproject.toml
└── README.md
```
- Unified API: `AIClient.summarise_text()` abstraction.
- Backend Agnostic: Select backends via env vars (`simple`, `hf`).
- Observability: Built-in Langfuse tracing via `traced_operation()`.
- Extensible: Designed for clean architectural extension.
- FastAPI Integration: Shows how downstream teams consume the SDK.
- Production Ready: Deployable via Docker or Kubernetes.
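The unified API and env-var backend selection can be sketched as follows. This is a self-contained stand-in, not the actual `ai_lib` internals; the backend registry and function names are illustrative assumptions.

```python
import os


def simple_summarise(text: str) -> str:
    # Naive "simple" backend: return the first sentence of the input.
    return text.split(".")[0].strip() + "."


# Illustrative registry; the real SDK would also register an "hf" backend.
BACKENDS = {"simple": simple_summarise}


class AIClient:
    """Stand-in for the SDK's unified client (illustrative only)."""

    def __init__(self) -> None:
        backend = os.getenv("AI_LIB_SUMMARISATION_BACKEND", "simple")
        self._summarise = BACKENDS[backend]

    def summarise_text(self, text: str) -> str:
        return self._summarise(text)
```

Downstream services only ever touch `AIClient.summarise_text()`, so swapping backends is an environment change, not a code change.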
- Unit Tests: Located in `tests/`.
- LLM Evals: DeepEval test suite in `eval/`.
- CI/CD: GitHub Actions pipeline included.
- Containerisation: Dockerfile included.
- Orchestration: Kubernetes Deployment + Service manifests.
- IaC: Terraform stub for namespace provisioning.
- GitOps: Structure ready for Argo CD.
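A test in `eval/` might look like the following pytest-style sketch. It uses a plain keyword-coverage metric as a stand-in, since the actual DeepEval metric configuration is project-specific; the function and test names are illustrative.

```python
def keyword_coverage(summary: str, required: list[str]) -> float:
    # Fraction of required keywords present in the summary (stand-in metric).
    hits = sum(1 for kw in required if kw.lower() in summary.lower())
    return hits / len(required)


def test_summary_mentions_key_facts():
    summary = "Bella is a 3-year-old indoor cat."  # would come from AIClient
    assert keyword_coverage(summary, ["Bella", "cat"]) >= 0.5
```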
For detailed setup instructions, see `docs/GETTING_STARTED.md`.
```bash
# 1. Clone and set up
git clone <repository-url>
cd ai-sdk
python -m venv .venv
source .venv/bin/activate  # Windows: .venv\Scripts\activate
make install

# 2. Run the service
make service-run

# 3. Test it (in another terminal)
make service-test
```

Install dependencies:

```bash
make install

# Or manually:
pip install -e ".[dev]"
pip install -r service/requirements.txt
```

Run the service:

```bash
make service-run

# Or manually:
uvicorn service.app.main:app --reload --host 0.0.0.0 --port 8000
```

Health Check:

```bash
curl http://localhost:8000/health
```

Summarise Text:

```bash
curl -X POST http://localhost:8000/summarise \
  -H "Content-Type: application/json" \
  -d '{"text": "Bella is a 3-year-old indoor cat..."}'
```

Build the image:

```bash
docker build -f service/DockerFile -t ai-summarise-service .
```

Run the container:

```bash
docker run -p 8000:8000 \
  -e AI_LIB_SUMMARISATION_BACKEND=simple \
  ai-summarise-service
```

To enable tracing, set the following environment variables:

```bash
export LANGFUSE_SECRET_KEY=...
export LANGFUSE_PUBLIC_KEY=...
export LANGFUSE_HOST=https://cloud.langfuse.com
```

Any SDK call wrapped with the tracer will appear in your Langfuse dashboard:

```python
with traced_operation("summarise_text", inputs={"text": text}):
    ...  # summarisation logic
```

Unit Tests:

```bash
pytest tests
```

DeepEval Tests:

```bash
pytest eval
```

Both test suites run automatically in the GitHub Actions pipeline.
The pipeline replicates how production AI teams enforce quality:
- Installs dependencies.
- Runs unit tests.
- Runs DeepEval tests (supports Langfuse-enabled evals).
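A workflow matching those steps might look like this; the file name, runner, Python version, and action versions are assumptions, not the repository's actual pipeline.

```yaml
# .github/workflows/ci.yml (illustrative)
name: CI
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install -e ".[dev]"
      - run: pip install -r service/requirements.txt
      - run: pytest tests
      - run: pytest eval
```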
Prerequisites:
- Minikube installed and running
- kubectl configured
Quick Start:
```bash
# 1. Create namespace with Terraform
cd terraform && terraform apply

# 2. Build image in Minikube's Docker
eval $(minikube docker-env)
docker build -f service/DockerFile -t ai-summarise-service:latest .

# 3. Create secrets (if using Langfuse)
kubectl create secret generic langfuse-secrets \
  --from-literal=secret-key='your-key' \
  --from-literal=public-key='your-key' \
  -n ai-platform

# 4. Deploy to Kubernetes
kubectl apply -f service/k8s/deployment.yml
kubectl apply -f service/k8s/service.yml

# 5. Access the service
kubectl port-forward service/ai-summarise-service 8000:80 -n ai-platform
```

Infrastructure as Code (IaC) for managing Kubernetes infrastructure.
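The manifest in `service/k8s/deployment.yml` presumably resembles this minimal sketch; the image name, namespace, and env var come from the commands above, while the replica count, labels, and pull policy are assumptions.

```yaml
# Illustrative sketch, not the repository's actual manifest
apiVersion: apps/v1
kind: Deployment
metadata:
  name: ai-summarise-service
  namespace: ai-platform
spec:
  replicas: 1
  selector:
    matchLabels:
      app: ai-summarise-service
  template:
    metadata:
      labels:
        app: ai-summarise-service
    spec:
      containers:
        - name: ai-summarise-service
          image: ai-summarise-service:latest
          imagePullPolicy: Never  # use the image built inside Minikube
          ports:
            - containerPort: 8000
          env:
            - name: AI_LIB_SUMMARISATION_BACKEND
              value: "simple"
```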
Quick Start:
```bash
cd terraform
terraform init
terraform plan
terraform apply
```

Creates the `ai-platform` namespace.
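Since the stub only provisions a namespace, it presumably boils down to something like the following; the provider configuration is omitted and the resource name is an assumption.

```hcl
# Illustrative sketch of the namespace stub
resource "kubernetes_namespace" "ai_platform" {
  metadata {
    name = "ai-platform"
  }
}
```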
This repository demonstrates the full lifecycle of AI platform engineering:
- Shared internal SDK
- Evaluation & Observability
- Microservice Integration
- Docker Containerisation
- CI/CD Automation
- Infra + GitOps Deployment Patterns
It is intended as a hands-on, end-to-end example of what a modern AI platform looks like.
- Publish `ai_lib` to a private PyPI registry.
- Add an Argo CD Application manifest.