Tend your family’s investments like an allotment. Harvest smarter wealth.
AllotMint is a private, serverless web app that turns real-world family investing into a visually engaging “allotment” you tend over time. It enforces strict compliance rules (30‑day minimum holding, 20 trades/person/month), runs entirely on AWS S3 + Lambda, and keeps your AWS and Python skills sharp.
For setup and usage instructions, see the USER_README.
For contributor workflow guidance, start with the Repository Guidelines.
- Portfolio Viewer – individual / adults / whole family.
- Compliance Engine – 30‑day sell lock & monthly trade counter.
- Stock Screener v1 – PEG < 1, P/E < 20, low D/E, positive FCF.
- Scenario Tester (Lite) – single‑asset price shocks.
- Lucy DB Pension Forecast – inflation‑linked income overlay.
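As a quick illustration of the Screener v1 rules, a minimal filter could look like the following. The field names, the "low D/E" threshold, and the sample data are hypothetical assumptions, not the project's actual schema:

```python
def passes_screen(stock: dict) -> bool:
    """Apply the v1 screener rules: PEG < 1, P/E < 20, low D/E, positive FCF."""
    return (
        stock["peg"] < 1
        and stock["pe"] < 20
        and stock["debt_to_equity"] < 0.5  # "low D/E" cutoff is an assumption
        and stock["free_cash_flow"] > 0
    )

# Hypothetical sample data for illustration only
candidates = [
    {"ticker": "AAA", "peg": 0.8, "pe": 15, "debt_to_equity": 0.3, "free_cash_flow": 1e6},
    {"ticker": "BBB", "peg": 1.4, "pe": 12, "debt_to_equity": 0.2, "free_cash_flow": 5e5},
]
picks = [s["ticker"] for s in candidates if passes_screen(s)]
```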
| Layer | Choice |
|---|---|
| Frontend | React + TypeScript → S3 + CloudFront |
| Backend | AWS Lambda (Python 3.12) behind API Gateway |
| Storage | S3 JSON / CSV (no RDBMS) |
| IaC | AWS CDK (Py) |
The backend, CI/CD workflows, and tests all target Python 3.12.
The repo includes a lightweight Yahoo Finance watchlist. Run it locally with:

```bash
# backend
uvicorn app:app --reload --port 8000 --host 0.0.0.0

# frontend
npm i && npm run dev
```
Simulate historical shocks against your portfolio. `apply_historical_event` calculates holding returns over several horizons from the event date. When a holding lacks data for a given horizon, returns fall back to the event's `proxy_index` via `load_meta_timeseries_range`.
The backend can parse transaction exports from supported providers. Upload a file and specify the provider name:
```bash
curl -F provider=degiro -F file=@transactions.csv \
  http://localhost:8000/transactions/import
```
For convenience, use the helper script:
```bash
python scripts/import_transactions.py degiro path/to/transactions.csv
```
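For reference, a normaliser for a provider export might be sketched like this. The column names (`Date`, `Ticker`, `Quantity`, `Price`) and the record shape are illustrative assumptions; real exports such as DeGiro's use their own layouts:

```python
import csv
import io

def parse_transactions(raw: str) -> list[dict]:
    """Normalise a provider CSV export into simple transaction records.
    Column names here are hypothetical, not a real provider's schema."""
    rows = csv.DictReader(io.StringIO(raw))
    return [
        {
            "date": r["Date"],
            "ticker": r["Ticker"],
            "quantity": float(r["Quantity"]),
            "price": float(r["Price"]),
        }
        for r in rows
    ]

sample = "Date,Ticker,Quantity,Price\n2024-01-02,VWRL,10,90.5\n"
txns = parse_transactions(sample)
```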
Runtime Python dependencies live in `requirements.txt`. Development tooling (CDK, Playwright, moviepy, etc.) is listed in `requirements-dev.txt`. Install both when working locally:

```bash
pip install -r requirements.txt -r requirements-dev.txt
```

Workflows and helper scripts install from these files, so update them when new packages are needed.
Format Python code with Black and lint it with Ruff. After installing the development dependencies, run:
```bash
black .
ruff check .
```

`black` rewrites files in place, while `ruff check` reports lint issues (use `ruff check --fix` to automatically apply safe fixes).
Install the Git hooks so code style checks and tests run automatically before each commit:
```bash
pre-commit install
```

The configured hooks format Python code with Black, lint with Ruff, and execute the pytest suite.
Sensitive settings are loaded from environment variables rather than `config.yaml`. Create a `.env` file with the helper script:

```bash
python scripts/setup_env.py
```

This copies `.env.example` to `.env` and prompts for required values such as `ALPHA_VANTAGE_KEY` and `JWT_SECRET`. You can then edit `.env` to set optional values like:
```bash
SNS_TOPIC_ARN=arn:aws:sns:us-east-1:123456789012:allotmint   # optional
TELEGRAM_BOT_TOKEN=123456789:ABCDEFGHIJKLMNOPQRSTUVWXYZ      # optional
TELEGRAM_CHAT_ID=123456789                                   # optional
DATA_ROOT=./data                                             # base directory for application data
GOOGLE_AUTH_ENABLED=true                                     # enable Google sign-in
GOOGLE_CLIENT_ID=your-client-id.apps.googleusercontent.com   # Google OAuth client (required when enabling)
```
`DATA_ROOT` overrides the `paths.data_root` value in `config.yaml`, allowing the backend to load data from a different directory.
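The override order can be sketched as follows; this is a simplified stand-in for the project's actual config loading, not its real code:

```python
import os

def resolve_data_root(config: dict) -> str:
    """DATA_ROOT from the environment wins over paths.data_root from
    config.yaml -- a sketch of the override order, not the real loader."""
    return os.environ.get("DATA_ROOT") or config["paths"]["data_root"]

config = {"paths": {"data_root": "./data"}}

os.environ.pop("DATA_ROOT", None)
default_root = resolve_data_root(config)    # falls back to config.yaml

os.environ["DATA_ROOT"] = "/srv/allotmint"
override_root = resolve_data_root(config)   # env var takes precedence
```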
Alternatively export variables in your shell. Unset variables simply disable their corresponding integrations.
When `GOOGLE_AUTH_ENABLED` is `true`, the backend requires a valid `GOOGLE_CLIENT_ID`. Create an OAuth 2.0 Client ID in the Google Cloud Console, then set it via the `GOOGLE_CLIENT_ID` environment variable or the `google_client_id` entry in `config.yaml`. With both enabled, the backend accepts Google ID tokens at `/token/google` and only issues API tokens for emails discovered in the accounts directory.
Local development typically leaves auth.disable_auth set to true in
config.yaml, meaning requests run without authentication. In that mode the
API loads demo data for the owner defined by auth.demo_identity (default:
steve). Adjust the value to point at whichever demo account exists under your
data/ directory when you want to explore different fixtures. Automated smoke
checks look at auth.smoke_identity instead so that they can target a
preconfigured dataset without disturbing the local default.
When you want the API and UI to behave as though a specific user is logged in
while authentication is disabled, set auth.local_login_email (or export
LOCAL_LOGIN_EMAIL). The value should be the email address you want the app to
assume for local sessions; the front end mirrors that value so profile- and
owner-scoped features resolve correctly without issuing a token.
Production or AWS deployments should flip auth.disable_auth to false and
enable Google sign-in. Once authentication is enforced the configured user from
the incoming token is used instead, so the demo_identity value is ignored
except as a fallback for unauthenticated requests (for example, smoke tests that
explicitly disable auth).
Web push uses VAPID credentials which can be generated and persisted to AWS
Parameter Store via scripts/setup_vapid_keys.py. The script creates public and
private key parameters (/allotmint/vapid/public and
/allotmint/vapid/private by default) when they are missing.
User alert thresholds and web push subscriptions persist to a tiny JSON object whose location is configurable via environment variables. Each URI may target a local file, an S3 object or an AWS Systems Manager Parameter Store entry.
| Variable | Description | Default |
|---|---|---|
| `ALERT_THRESHOLDS_URI` | Location of threshold configuration. | `file://data/alert_thresholds.json` |
| `PUSH_SUBSCRIPTIONS_URI` | Storage for web push subscriptions. | `file://data/push_subscriptions.json` |
URIs may use the following schemes:
- `file:///path/to/file.json` – read/write a local file (useful for tests).
- `s3://bucket/key.json` – store JSON in an S3 object.
- `ssm://parameter-name` – store JSON in AWS Parameter Store.
When using AWS backends the executing role must allow:
- S3 – `s3:GetObject` and `s3:PutObject` on the referenced object.
- Parameter Store – `ssm:GetParameter` and `ssm:PutParameter` on the parameter name.
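A minimal sketch of the scheme dispatch, implementing only the `file://` branch; the `s3://` and `ssm://` branches would call an AWS client such as boto3 and are stubbed out here:

```python
import json
from pathlib import Path
from urllib.parse import urlparse

def load_json_uri(uri: str) -> dict:
    """Load JSON from a storage URI. Illustrative sketch: only file://
    is implemented; s3:// and ssm:// would use boto3 in the real backends."""
    parsed = urlparse(uri)
    if parsed.scheme == "file":
        # Relative URIs like file://data/x.json put "data" in netloc,
        # so rejoin netloc and path before reading.
        path = (parsed.netloc + parsed.path) if parsed.netloc else parsed.path
        return json.loads(Path(path).read_text())
    if parsed.scheme in ("s3", "ssm"):
        raise NotImplementedError(f"{parsed.scheme}:// needs an AWS client")
    raise ValueError(f"unsupported scheme: {parsed.scheme}")
```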
For a more app-like experience on iOS and Android consider building thin wrappers with Capacitor or similar tooling. This simplifies installation and enables more reliable notifications on mobile devices.
```bash
npm install
npm run mobile:add       # generates android/ and ios/ projects under mobile/
```

The React frontend is bundled and copied into the native shells with:

```bash
npm run mobile:android   # or: npm run mobile:ios
```

The build reuses the `VITE_VAPID_PUBLIC_KEY` from your `.env` so web push
notifications behave the same inside the Capacitor web view. Native wrappers
use platform channels (Firebase Cloud Messaging on Android and Apple Push
Notification service on iOS) which deliver more reliably than browser-based
VAPID push.
Expensive API routes cache their JSON responses under data/cache/<page>.json.
Each request serves the cached payload when it is fresh; on a miss or stale
entry the response is rebuilt, returned to the client and saved in the
background. A lightweight scheduler keeps caches warm by rebuilding them at a
fixed interval.
| Page | TTL (seconds) |
|---|---|
| Portfolio views | 300 |
| Screener queries | 900 |
Adjust the `PORTFOLIO_TTL` and `SCREENER_TTL` constants in `backend/routes/portfolio.py` and `backend/routes/screener.py` to change these intervals. The cache helpers live in `backend/utils/page_cache.py`.
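The serve-if-fresh, rebuild-on-stale behaviour can be sketched as follows. This is a simplified, synchronous stand-in for `backend/utils/page_cache.py`: the real helpers save in the background and a scheduler keeps entries warm:

```python
import json
import time
from pathlib import Path

def cached_page(page: str, ttl: int, rebuild, cache_dir: Path) -> dict:
    """Serve <cache_dir>/<page>.json while fresh; rebuild and save the
    payload on a miss or a stale entry. Simplified sketch of the page cache."""
    path = cache_dir / f"{page}.json"
    if path.exists() and time.time() - path.stat().st_mtime < ttl:
        return json.loads(path.read_text())  # fresh: serve cached payload
    payload = rebuild()                      # miss/stale: rebuild...
    path.write_text(json.dumps(payload))     # ...and persist for next time
    return payload
```

Usage: `cached_page("portfolio", 300, build_portfolio_view, Path("data/cache"))`, where `build_portfolio_view` is whatever expensive function produces the JSON payload.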
FX conversions use daily GBP rates. When offline_mode: true in
config.yaml, rates are loaded from parquet files under
data/timeseries/fx/<CCY>.parquet. Each file must contain Date and Rate
columns. Populate the cache before going offline:
```bash
python - <<'PY'
from datetime import date
import pandas as pd
from backend.utils.fx_rates import fetch_fx_rate_range

df = fetch_fx_rate_range("USD", "GBP", date(2024, 1, 1), date.today())
df.to_parquet("data/timeseries/fx/USD.parquet", index=False)
PY
```

If a currency file is missing, `_convert_to_base_currency` falls back to requesting rates from the `fx_proxy_url` configured in `config.yaml`.
The backend exposes Value at Risk (VaR) metrics for each portfolio.
- Defaults – 95 % confidence over a 1‑day horizon and 99 % over 10 days.
- Query – `GET /var/{owner}?days=30&confidence=0.99` fetches a 30‑day, 99 % VaR.
- UI – VaR surfaces alongside portfolio charts on the performance dashboard.
Assumptions
- Historical simulation using daily returns from cached price series.
- Results reported in GBP.
- Calculations default to a 365‑day window (`days` parameter).
See backend/common/portfolio_utils.py for the return series that feed the calculation and backend/common/constants.py for currency labels, and the VaR documentation for methodology.
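The historical-simulation approach can be sketched as follows. This uses one simple quantile convention; the backend's exact implementation may differ:

```python
def historical_var(returns: list[float], confidence: float = 0.95) -> float:
    """Historical-simulation VaR: the loss at the `confidence` quantile of
    the daily loss distribution, returned as a positive fraction.
    Simplified sketch, not the backend's exact code."""
    losses = sorted(-r for r in returns)  # losses as positive numbers, ascending
    idx = min(int(confidence * len(losses)), len(losses) - 1)
    return losses[idx]

# Scale by portfolio value to express VaR in GBP (toy return series):
var_gbp = 100_000 * historical_var([-0.04, -0.02, 0.01, 0.03], 0.95)
```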
GET /reports/{owner} compiles realized gains, income and performance metrics
for a portfolio. Pass format=csv or format=pdf to download the report in
your preferred format.
The project is split into a Python FastAPI backend and a React/TypeScript
frontend. The two communicate over HTTP which makes it easy to work on either
side in isolation. Backend runtime options are stored in config.yaml:
```yaml
app_env: local
uvicorn_host: 0.0.0.0
uvicorn_port: 8000
reload: true
log_config: backend/logging.ini
```

Adjust these values to change the environment or server behaviour.
Environment-specific CORS whitelists are defined in the same file:
```yaml
cors:
  local:
    - http://localhost:3000
  production:
    - https://app.allotmint.io
```

The list matching `app_env` is applied to the backend's CORS middleware.
Optional frontend tabs can be toggled in config.yaml under ui.tabs:
```yaml
ui:
  tabs:
    instrument: true
    performance: true
    transactions: true
    screener: true
    trading: true
    timeseries: true
    watchlist: true
    virtual: true
    reports: true
    support: true
```

Setting a tab to `false` removes its menu entry and related links from the UI.
```bash
# clone & enter
git clone git@github.com:leonarduk/allotmint.git
cd allotmint

# set up Python venv for CDK & backend (optional)
python -m venv .venv && source .venv/bin/activate
pip install -r requirements.txt -r requirements-dev.txt

# configure API settings
# (see config.yaml for app_env, uvicorn_host, uvicorn_port, reload and log_config)
./run-local-api.sh   # or use run-backend.ps1 on Windows

# in another shell install React deps and start Vite on :5173
cd frontend
npm install
npm run dev

# visit the app
open http://localhost:5173/
```

- Authentication:
  - Set `API_TOKEN` and include it as an `X-API-Token` header in requests, or
  - leave `API_TOKEN` unset to disable authentication during local development.
- A previous username/password flow using `OAuth2PasswordRequestForm` has been removed. Reintroduce it manually if needed by recreating a login endpoint that validates credentials and issues tokens with `create_access_token`.
- When `disable_auth` is `true` (the default in `config.yaml`) the discovery helpers expose every owner locally, regardless of whether account data is read from the filesystem or an S3 bucket.
- In production environments set `disable_auth` to `false` and supply an `X-API-Token` (or other configured authentication). S3 discovery will then return only the current user's accounts or any owners that list the user as a viewer, and unauthenticated requests receive an empty list. See the Authentication section for details.
Trading alerts support multiple transports that are enabled via environment variables:
- AWS SNS – set `SNS_TOPIC_ARN` to publish alerts to an SNS topic using `backend.common.alerts.publish_alert`.
- Telegram – provide `TELEGRAM_BOT_TOKEN` and `TELEGRAM_CHAT_ID` to forward alerts to a Telegram chat via `backend.utils.telegram_utils`.
When several transports are configured, alerts are sent to each of them.
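The enable-by-environment rule can be sketched as follows; this is a simplified stand-in, since the real senders live in `backend.common.alerts` and `backend.utils.telegram_utils`:

```python
def enabled_transports(env: dict) -> list[str]:
    """Which alert transports are active for a given environment mapping.
    Mirrors the rule that unset variables simply disable their integrations."""
    active = []
    if env.get("SNS_TOPIC_ARN"):
        active.append("sns")
    if env.get("TELEGRAM_BOT_TOKEN") and env.get("TELEGRAM_CHAT_ID"):
        active.append("telegram")
    return active
```

A dispatcher would then send the alert once per entry in `enabled_transports(os.environ)`, so configuring several transports delivers to each of them.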
Account and instrument files live in a separate data repository. Clone or sync it alongside the application before running the backend:
```bash
# clone once
git clone git@github.com:your-org/allotmint-data.git data

# pull updates
cd data && git pull
```

Point the backend at the checked-out data by setting `DATA_ROOT` (or `accounts_root` in `config.yaml`) to the directory path:

```bash
# .env or shell
DATA_ROOT=$(pwd)/data
```

The backend uses this folder when `config.app_env: local` or when `DATA_BUCKET` is unset. Commit changes to update the data set:

```bash
cd data
git add accounts/alice/trades.csv
git commit -m "Update Alice trades"
git push
```

All runtime data now resides in an external S3 bucket instead of the repository. Sync the contents before starting the backend:

```bash
aws s3 sync s3://$DATA_BUCKET/ data/
```

The local startup scripts (`run-local-api.sh` and `run-backend.ps1`) perform this sync automatically when `DATA_BUCKET` is set.

When running in AWS (`config.app_env: aws`), account and metadata JSON files are loaded from S3. Set the following environment variables:
```bash
DATA_BUCKET=allotmint-prod-data
METADATA_BUCKET=allotmint-metadata
METADATA_PREFIX=instruments/
```

The Lambda execution role requires:

- `s3:ListBucket` (with a prefix of `accounts/`) – discover available accounts.
- `s3:GetObject` on `accounts/*` – read account and `person.json` files.
- `s3:PutObject` / `s3:DeleteObject` on the same paths to push updates.
Instrument metadata resides under METADATA_PREFIX in METADATA_BUCKET.
To upload new data:
```bash
aws s3 sync data/accounts s3://$DATA_BUCKET/accounts/
```

Uploading requires IAM permissions matching the above `s3:PutObject` rules.
The S3 bucket mirrors the previous data/ directory:
```text
accounts/
cache/
events/
instruments/
metrics/
prices/
timeseries/
transactions/
alert_thresholds.json
jpm.json
scaling_overrides.json
skipped_tickers.log
virtual_portfolios/
```
London-listed instruments now inherit a 0.01 scale factor when their metadata currency is GBX (pence). Update `scaling_overrides.json` only for edge cases where the automatic detection is incorrect.
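The scaling rule can be sketched as follows; this is a simplified stand-in for the actual detection logic, and the override shape (ticker-to-factor mapping) is an assumption:

```python
def price_scale(currency: str, overrides: dict, ticker: str) -> float:
    """GBX (pence) prices inherit a 0.01 scale factor; entries in
    scaling_overrides.json win for edge cases. Illustrative sketch."""
    if ticker in overrides:
        return overrides[ticker]  # manual override beats auto-detection
    return 0.01 if currency == "GBX" else 1.0

# Example: a 1234 GBX quote becomes 12.34 GBP
gbp_price = 1234 * price_scale("GBX", {}, "VOD.L")
```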
Run Python and frontend test suites with:
```bash
pytest
cd frontend && npm test
```

Some API tests load sample account data from `config.accounts_root` (defaulting to `data/accounts`). If this directory is absent, the tests are skipped.
The PY_COV_MIN environment variable lets you enforce a minimum coverage
percentage during pytest runs. Use it together with PYTEST_ADDOPTS to pass
the desired threshold to pytest:
```bash
PY_COV_MIN=80 PYTEST_ADDOPTS="--cov-fail-under=$PY_COV_MIN" pytest
```

An optional `error_summary` section in `config.yaml` stores settings for the
run_with_error_summary.py utility. When the field is missing the backend falls
back to an empty mapping so the script can still be used with explicit
arguments. You can capture error lines by running the helper which writes them
to error_summary.log. Optionally set a default command in
config.yaml under error_summary.default_command so the script can run
without CLI arguments:
```yaml
error_summary:
  default_command: ["pytest"]
```

Running `python run_with_error_summary.py` with no arguments will then use the configured default.

```bash
# example
python run_with_error_summary.py pytest
```

Retrieve trade signals through the API:
```bash
curl http://localhost:8000/trading-agent/signals
```

In production, the `price_refresh` Lambda can invoke the agent after updating prices.
The project includes an AWS CDK stack that provisions an S3 bucket and CloudFront distribution for the frontend. To deploy the site:
```bash
# build the frontend assets first
cd frontend
npm install
npm run build
cd ..

# deploy the static site stack only
cd cdk
cdk bootstrap   # only required once per AWS account/region
DEPLOY_BACKEND=false cdk deploy StaticSiteStack

# or include the backend Lambda stack
DEPLOY_BACKEND=true cdk deploy BackendLambdaStack StaticSiteStack
# equivalently: cdk deploy BackendLambdaStack StaticSiteStack -c deploy_backend=true
```

The bucket remains private and CloudFront uses an origin access identity with Price Class 100 to minimise cost while serving the content over HTTPS.
Runtime portfolio data lives in a separate S3 bucket referenced by the
DATA_BUCKET environment variable. Files are stored under the accounts/
prefix:
- `accounts/<owner>/trades.csv`
- `accounts/<owner>/<account>.json`
- `accounts/<owner>/person.json`
- Instrument metadata files under `METADATA_PREFIX` (default `instruments/`) in the bucket specified by `METADATA_BUCKET`.
Lambdas that read portfolio information require s3:GetObject permission for
these paths. A minimal IAM policy statement is:
```json
{
  "Effect": "Allow",
  "Action": ["s3:GetObject"],
  "Resource": "arn:aws:s3:::YOUR_DATA_BUCKET/accounts/*"
}
```

The CI workflow in `.github/workflows/deploy-lambda.yml` uses GitHub's
OpenID Connect (OIDC) provider to assume an IAM role at deploy time. To
enable this:
1. Create an IAM role with permissions to deploy the CDK stack.
2. Add a trust policy that allows the GitHub OIDC provider to assume the role. A minimal example is:

   ```json
   {
     "Version": "2012-10-17",
     "Statement": [
       {
         "Effect": "Allow",
         "Principal": {
           "Federated": "arn:aws:iam::YOUR_ACCOUNT_ID:oidc-provider/token.actions.githubusercontent.com"
         },
         "Action": "sts:AssumeRoleWithWebIdentity",
         "Condition": {
           "StringEquals": {
             "token.actions.githubusercontent.com:sub": "repo:YOUR_ORG/YOUR_REPO:ref:refs/heads/main"
           }
         }
       }
     ]
   }
   ```

3. Store the role ARN in the repository as the `AWS_ROLE_TO_ASSUME` secret and set `AWS_REGION` as needed.
Remove any long-term `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` secrets, as they are no longer required.
Install the video dependencies first. The `requirements-dev.txt` file includes `moviepy` and `gTTS`:

```bash
pip install -r requirements.txt -r requirements-dev.txt
```

Save an image named `presenter.png` in the `scripts` directory, then run:

```bash
python scripts/make_allotmint_video.py
```

The script will produce `allotmint_video.mp4` in the repository root.
This project is licensed under the MIT License. See LICENSE for details.
If you're developing on Windows use the following equivalents. These examples target
cmd.exe and PowerShell where noted.
1. Create and activate a Python virtual environment:

   - cmd.exe:

     ```bat
     python -m venv .venv
     .venv\Scripts\activate
     pip install -r requirements.txt -r requirements-dev.txt
     ```

   - PowerShell (may require changing execution policy to run `Activate.ps1`):

     ```powershell
     python -m venv .venv
     .\.venv\Scripts\Activate.ps1
     pip install -r requirements.txt -r requirements-dev.txt
     ```

2. Start the backend using the provided PowerShell helper (recommended on Windows):

   ```powershell
   .\run-backend.ps1   # or run the Unix-style helper if using WSL: ./run-local-api.sh
   ```

3. From another terminal start the frontend (cmd.exe / PowerShell):

   ```bat
   cd frontend
   npm install
   npm run dev
   ```

4. Open the UI in your browser:

   - cmd.exe: `start http://localhost:5173/`
   - PowerShell: `Start-Process http://localhost:5173/`

5. Setting environment variables (examples):

   - cmd.exe:

     ```bat
     set VITE_ALLOTMINT_API_BASE=https://api.example.com
     npm run dev
     ```

   - PowerShell:

     ```powershell
     $env:VITE_ALLOTMINT_API_BASE = 'https://api.example.com'
     npm run dev
     ```
Notes:
- Some shell examples in this README use POSIX syntax (for macOS/Linux). On Windows you can use Command Prompt, PowerShell, or WSL; the `run-backend.ps1` helper is provided for native PowerShell usage.
- The `DATA_ROOT=$(pwd)/data` and similar POSIX-style snippets earlier in this document assume a Unix-like shell; on Windows replace `$(pwd)` with `%cd%` (cmd) or `(Get-Location).Path` (PowerShell), or use WSL for the POSIX forms.