Local development stack for OLT: a boilersync Django/React wrapper app plus self-hosted learning tools behind one local Nginx router.
Add these names to /etc/hosts:
127.0.0.1 olt.localhost auth.localhost cryptpad.localhost cryptpad-sandbox.localhost peertube.localhost scholarsome.localhost h5p.localhost code.localhost chat.localhost lrs.localhost

Most browsers already resolve *.localhost, but the hosts entry keeps CLI tools and older resolvers predictable.
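To confirm resolution before bringing anything up, a quick check along these lines should point each name at 127.0.0.1 (the exact lookup command depends on your OS; nothing is listening yet, so only resolution matters here):

```bash
# Linux: query the resolver directly
getent hosts olt.localhost lrs.localhost

# Portable fallback: ping resolves the name even though no service answers yet
ping -c 1 olt.localhost
```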
Create local secrets from the example and start the stack:
cp .env.example .env
docker compose up -d --build

The Django API is built from this Multi workspace using api-core plus the generated api/ package. Project Django settings live in api/open_learning_tools_api/settings.py; api-core is left as the shared runtime.
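After the build, it is worth confirming the containers are actually up before opening the URLs below; `docker compose ps` is enough for that, and following the router's logs helps when a hostname does not respond (the `nginx` service name here is an assumption, so use whatever the compose file calls the router):

```bash
docker compose ps              # every service should show "running" (or "healthy" where healthchecks exist)
docker compose logs -f nginx   # tail the local router while hitting the *.localhost names
```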
- Wrapper: http://olt.localhost
- Admin / identity provider: http://auth.localhost/admin/
- Docs: http://cryptpad.localhost
- Videos: http://peertube.localhost
- Flashcards: http://scholarsome.localhost
- Quizzes: http://h5p.localhost
- Code: http://code.localhost
- AI Chat: http://chat.localhost
- LRS: http://lrs.localhost
The Django backend enables django-oauth-toolkit OIDC at:
http://olt.localhost/.well-known/openid-configuration
http://olt.localhost/o/.well-known/jwks.json

On backend startup, local OAuth clients are bootstrapped for Videos, AI Chat, Docs proxy auth, Flashcards proxy auth, and Code proxy auth using values from .env. The Videos container also has a bootstrap helper that installs/configures its OpenID Connect plugin from those same values.
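Fetching the discovery document is a quick way to confirm the provider answers through the router; any OIDC discovery response carries at least the issuer and JWKS URI, so a check like this is enough (jq is optional and only used for readability):

```bash
curl -s http://olt.localhost/.well-known/openid-configuration \
  | jq '{issuer, authorization_endpoint, token_endpoint, jwks_uri}'
```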
Some services still need their own app-level setup after the containers are reachable:
- Videos: OIDC is bootstrapped automatically; complete the first-run profile/admin prompts when they appear.
- AI Chat: OIDC is configured through .env; model/provider keys are loaded from an untracked local provider env file, and LIBRECHAT_ENDPOINTS defaults to openAI,agents.
- Docs: protected by oauth2-proxy on cryptpad.localhost and configured with the local OIDC SSO plugin; cryptpad-sandbox.localhost remains reachable as the required sandbox origin (a quick proxy smoke test is sketched after this list).
- Flashcards and Code: protected by oauth2-proxy; deeper in-app user mapping can be added later.
- Quizzes: add real H5P content packages to the lightweight host.
- LRS: local-dev xAPI forwarding is wired through the ingest bridge described below.
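For the hosts behind oauth2-proxy, a cheap smoke test is to hit one of them unauthenticated and confirm the proxy intercepts the request rather than serving the app; the exact status code and redirect target depend on the local oauth2-proxy config, so treat this as a rough check only:

```bash
# An unauthenticated request should get a redirect (or 401/403) from oauth2-proxy,
# not a 200 straight from CryptPad.
curl -sI http://cryptpad.localhost/ | head -n 5
```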
Ralph remains available directly with Basic Auth from .env:
http://lrs.localhost/xAPI/statements

Services should normally send statements through the local ingest bridge instead of using Ralph credentials directly:

- Browser/client code: http://lrs.localhost/ingest/xapi/statements
- Container/server code: http://xapi-ingest:8090/xapi/statements

The bridge forwards to Ralph server-side, so sub-service browser code never needs the LRS password. Compose passes these values to services as OLT_XAPI_PUBLIC_INGEST_URL, OLT_XAPI_INTERNAL_INGEST_URL, and OLT_XAPI_ACTIVITY_PREFIX.
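As a quick end-to-end check, a minimal xAPI statement can be posted to the public ingest URL from the host. The body below is a standard xAPI statement with placeholder actor/verb/object values, and it assumes the bridge accepts plain statement POSTs without extra authentication; adjust if the ingest endpoint expects anything more specific:

```bash
curl -s -X POST http://lrs.localhost/ingest/xapi/statements \
  -H 'Content-Type: application/json' \
  -H 'X-Experience-API-Version: 1.0.3' \
  -d '{
        "actor":  {"mbox": "mailto:dev@olt.localhost", "name": "Local Dev"},
        "verb":   {"id": "http://adlnet.gov/expapi/verbs/experienced",
                   "display": {"en-US": "experienced"}},
        "object": {"objectType": "Activity",
                   "id": "http://olt.localhost/activities/local-smoke-test"}
      }'
```

If the bridge forwards successfully, the statement should show up in the verification query at the end of this section.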
Current local-dev coverage:
- Videos: PeerTube plugin emits server video upload/view events and browser playback events.
- Docs: CryptPad customization emits best-effort page/app visit events.
- Quizzes: H5P host forwards H5P xAPI statements from the H5P dispatcher.
- Flashcards: Nginx injects a browser route-visit forwarder for Scholarsome.
- Code: code-server emits a container startup activity through its init hook.
- AI Chat: LibreChat preload emits session/chat/conversation activity.
For local verification:
curl -u "$RALPH_LRS_USERNAME:$RALPH_LRS_PASSWORD" \
http://lrs.localhost/xAPI/statements
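When statements are flowing, piping the same query through jq gives a quick summary; this assumes Ralph returns the standard xAPI StatementResult shape with a top-level statements array, and the version header is included in case Ralph enforces it on direct access:

```bash
curl -s -u "$RALPH_LRS_USERNAME:$RALPH_LRS_PASSWORD" \
  -H 'X-Experience-API-Version: 1.0.3' \
  http://lrs.localhost/xAPI/statements \
  | jq '.statements[] | {verb: .verb.id, object: .object.id, timestamp}'
```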