- Docker and Docker Compose installed
- Clone this repository:
git clone https://github.com/akashjaura16/sleepTrackerApp.git
cd sleepTrackerApp
- Build and start the app and database:
  docker compose up --build
- Access the app:
  - Open http://localhost:3000/api/student
  - You should see:
    { "name": "akashdeep singh", "studentId": "224911605" }
- All required config is provided via Docker Compose.
- No secrets are hard-coded.
- If you use any cloud services, provide instructions for the marker to obtain credentials (e.g., via OnTrack).
- All database-backed features (signup, login, etc.) work out-of-the-box.
- Marker does not need to modify any code or guess missing info.
For containerization, I used Docker Compose to run both the Node.js app and MongoDB together. This ensures the app works in any environment and the database is always available. I set environment variables in the compose file to avoid hard-coding secrets. The most challenging part was troubleshooting database connectivity between containers, which I resolved by using service names in the MongoDB URI. I found it rewarding to see the app run end-to-end in Docker, knowing it’s fully portable and reproducible. I tested all features in a clean environment to ensure reliability.
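The container setup described above can be sketched with a compose file along these lines (service names, ports, and the MongoDB image version here are illustrative, not the repository's actual file). The key point is that the app reaches the database via the `mongo` service name rather than `localhost`, since Compose resolves service names on its internal network:

```yaml
services:
  app:
    build: .
    ports:
      - "3000:3000"
    environment:
      # "mongo" resolves to the database container on the compose network
      MONGODB_URI: mongodb://mongo:27017/alive-sleep-tracker
    depends_on:
      - mongo
  mongo:
    image: mongo:7
    volumes:
      - mongo-data:/data/db
volumes:
  mongo-data:
```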
The Alive Sleep Tracker App is a Node.js and Express-based web application that provides a foundation for tracking and managing sleep-related data. The application follows a clean, modular architecture and uses environment variables to manage configuration across environments securely.
- Technologies Used
- Project Features
- Project Structure
- Application Architecture
- Templates
- MongoDB Database Integration
- Secure Environment Variable Management
- Error Handling
- Testing
- Static Assets
- Auth0 Integration
- Contentful Integration
- OpenAI Integration
- Environment Variables
- How to Run
- Version Control Practices
- Node.js (Run-time environment)
- Express.js (Web framework)
- Nodemon (Development server auto-reloading)
- MongoDB (Database)
- Mongoose (MongoDB object modelling)
- EJS (Embedded JavaScript templating)
- dotenv (Configuration management)
- express-openid-connect (Authentication)
- Mocha (Test framework)
- Chai (Assertion library)
- Supertest (HTTP assertion library)
- Sinon (Mock/stub library)
- Proxyquire (Testing module replacement library)
- Playwright (End-to-end browser testing framework)
- Express.js server with a modular and maintainable structure
- Server-side rendering using EJS templates
- MongoDB database integration
- Secure environment variable management
- Centralised error handling (404 and 500 error pages)
- Centralised unit and integration testing
- Static asset handling for CSS and JavaScript
- Auth0 authentication integration for user login and registration
The codebase follows an MVC-aligned architecture to keep responsibilities
separated and maintenance straightforward. Public assets such as CSS, JavaScript, and images live
in the public directory so they can be served directly without touching
application code. Core application logic is organised under src, where
controllers, helpers, models, routes, and views sit in their own folders, making
it simple to find and modify related functionality. Automated tests reside in
tests, grouped into helper utilities and integration flows, so quality checks
stay close to the code they validate. This separation keeps changes isolated,
improves onboarding for new contributors, and lets teams iterate on features
without stepping on each other’s work.
public/
├── img/ # Image assets
├── css/ # Stylesheets
└── js/ # JavaScript files
src/
├── controllers/ # Application logic
├── helpers/ # Utility functions (DB, Auth0, config)
├── models/ # Database schemas
├── routes/ # Application routes
├── services/ # Domain-specific services
├── views/ # EJS templates and components
├── app.js # Express app factory
└── server.js # Server bootstrap
tests/
├── helpers/ # Test utilities
├── integration/ # Integration test suites
│ ├── flows/ # Integration flow tests
│ └── pages/ # Page rendering tests
├── unit/ # Unit test suites
├── smoke/ # Quick health check tests
└── e2e/ # Browser-based end-to-end tests
api/ # Serverless API function for Vercel
docs/ # Supporting documentation
.github/ # GitHub actions
.env.example # Example environment variables
.gitignore # Git ignore rules
README.md # Project documentation
package.json # Project metadata and dependencies
package-lock.json # Locked dependencies
vercel.json # Vercel deployment configuration
The following diagram shows the main elements of the application, third-party services, and their relationships:
The following diagram illustrates the object relationships and data flow within the application:
The following diagram shows the database model and relations between collections:
Source files for these diagrams (Mermaid format) are available in docs/charts/
and can be edited to reflect architectural changes.
Views are built with EJS templates. Layout components such as headers,
navigation, and footers are defined in src/views/components, while page-level
templates are defined in src/views/pages. Templates receive data via
res.render, and shared locals are defined where they are needed, keeping the
layout flexible.
Database connectivity is handled through helper modules in src/helpers/db.js,
which wrap Mongoose connection management. Schemas and models live in
src/models, keeping database structure separate from business logic so models
can evolve without impacting controllers or routes.
Configuration is centralised in src/helpers/settings.js, which loads values
via dotenv and exposes a frozen appConfig object. All modules—including the
AI helper and Contentful service—should read OPENAI_API_KEY,
CONTENTFUL_SPACE_ID, and CONTENTFUL_ACCESS_TOKEN from appConfig rather
than process.env. This keeps a single source of truth and consistent defaults
for local development.
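A minimal sketch of this pattern (key names and defaults here are illustrative, not the project's actual settings.js; dotenv loading is omitted):

```javascript
// Read configuration once from the environment and freeze it so no module
// can mutate it at runtime — a single source of truth with local defaults.
function buildAppConfig(env = process.env) {
  return Object.freeze({
    port: Number(env.PORT) || 3000,
    baseUrl: env.BASE_URL || 'http://localhost:3000',
    mongodbUri: env.MONGODB_URI || 'mongodb://localhost:27017/alive-sleep-tracker',
    openaiApiKey: env.OPENAI_API_KEY || '',
    contentfulSpaceId: env.CONTENTFUL_SPACE_ID || '',
    contentfulAccessToken: env.CONTENTFUL_ACCESS_TOKEN || '',
  });
}

const appConfig = buildAppConfig({ PORT: '4000' });
console.log(appConfig.port); // 4000
console.log(Object.isFrozen(appConfig)); // true
```

Modules then import `appConfig` instead of touching `process.env` directly, which keeps defaults consistent across local development, tests, and deployment.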
The application defines 404 and 500 flows, rendering dedicated EJS templates
from src/views/pages/errors. Centralising these handlers keeps user-facing
feedback consistent while ensuring unexpected failures are logged for further
investigation.
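A minimal sketch of such centralised handlers, assuming the error template paths shown in the project structure (handler names are illustrative):

```javascript
// 404 handler: registered after all routes, catches anything unmatched.
function notFoundHandler(req, res) {
  res.status(404).render('pages/errors/404');
}

// 500 handler: Express recognises the 4-argument signature as an error handler.
function serverErrorHandler(err, req, res, next) {
  console.error(err); // log unexpected failures for later investigation
  res.status(500).render('pages/errors/500');
}
```

In an Express app these would typically be mounted last, e.g. `app.use(notFoundHandler)` followed by `app.use(serverErrorHandler)`.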
The project uses a comprehensive testing setup with multiple libraries, each serving a specific purpose:
- Mocha: Test framework that provides the structure for organising and running tests with `describe` and `it` blocks
- Chai: Assertion library that provides readable assertions like `expect().to.be.true` and `expect().to.equal()`
- Supertest: HTTP assertion library that allows testing Express routes and middleware by simulating HTTP requests
- Sinon: Mocking and stubbing library used to create spies, stubs, and mocks for isolating units under test
- Proxyquire: Module replacement library that enables mocking dependencies when requiring modules, to isolate code under test
- Playwright: End-to-end browser testing framework for simulating real user interactions in a headless browser environment
- Unit Tests: Place in `tests/unit/`, matching the source structure
  - Example: `src/controllers/homeControllers.js` → `tests/unit/controllers/homeControllers.test.js`
- Integration Tests: Place in `tests/integration/`, organised by feature area
  - Example: API tests in `tests/integration/api/`, page tests in `tests/integration/pages/`
- Test Structure:

```js
const { expect } = require('chai');
const sinon = require('sinon');
const { functionToTest } = require('../../../src/path/to/module');

describe('Module name', () => {
  afterEach(() => {
    sinon.restore(); // Clean up stubs after each test
  });

  it('should do something', () => {
    // Test implementation
    expect(result).to.equal(expected);
  });
});
```
Supertest is used for integration tests that need to make HTTP requests to your Express app:

```js
const { expect } = require('chai');
const { buildRequest } = require('../../helpers/testServer');

describe('API endpoints', () => {
  it('returns JSON response', async () => {
    const response = await buildRequest().get('/api');
    expect(response.status).to.equal(200);
    expect(response.type).to.match(/json/);
    expect(response.body).to.deep.equal({ message: 'Welcome' });
  });
});
```

The `buildRequest()` helper from `tests/helpers/testServer.js` creates a test client for the app.
Sinon is used for creating stubs, spies, and mocks to isolate code under test:

```js
const sinon = require('sinon');

// Stub a function
const stub = sinon.stub(module, 'functionName').returns('mocked value');

// Stub console methods
sinon.stub(console, 'error');

// Stub Express response methods
const res = {
  status: sinon.stub().returnsThis(),
  render: sinon.stub(),
  json: sinon.stub(),
};

// Verify calls
expect(res.render.calledOnceWithExactly('template', { data })).to.be.true;

// Clean up
sinon.restore();
```

Proxyquire is used to replace dependencies when requiring modules, allowing you to inject mocks:
```js
const proxyquire = require('proxyquire').noCallThru().noPreserveCache();

describe('Module with dependencies', () => {
  it('should use mocked dependency', () => {
    const mockDependency = {
      someFunction: sinon.stub().returns('mocked'),
    };
    const moduleUnderTest = proxyquire('../../../src/path/to/module', {
      '../path/to/dependency': mockDependency,
    });
    const result = moduleUnderTest.functionToTest();
    expect(mockDependency.someFunction.calledOnce).to.be.true;
    expect(result).to.equal('mocked');
  });
});
```

Key Proxyquire options:

- `.noCallThru()`: Prevents the original module from being loaded
- `.noPreserveCache()`: Ensures fresh module loads for each test
The project uses a multi-layered testing strategy:
- Unit Tests (`tests/unit/`): Test individual functions and modules in isolation using mocks/stubs
- Integration Tests (`tests/integration/`): Test API endpoints and page rendering with real app wiring
  - `tests/integration/flows/`: Full user flows (sleep entry, goal setting)
  - `tests/integration/pages/`: Page rendering and routing
- Smoke Tests (`tests/smoke/`): Quick health checks for basic app functionality
- E2E Tests (`tests/e2e/`): Browser-based tests using Playwright for real user interactions
Run all tests:

```bash
npm test
```

Run specific test suites:

```bash
# Unit tests only
npm run test:unit

# Integration tests only
npm run test:integration

# Smoke tests only (quick health checks)
npm run test:smoke

# End-to-end browser tests
npm run test:e2e
```

Run a specific test file:

```bash
npx mocha tests/unit/controllers/homeControllers.test.js
```

Run tests matching a pattern:

```bash
npm test -- --grep "Error controllers"
```

Frontend assets (stylesheets, scripts, images) are served from the public
directory via `express.static`. Templates reference them using path helpers
(`/css/styles.css`, `/js/scripts.js`), allowing assets to live outside the
application logic.
Authentication flows are powered by express-openid-connect. Configuration is
read from environment variables via appConfig, and middleware is instantiated
in src/helpers/auth.js.
The application follows a privacy-first approach:
- No private information stored: User email addresses, names, or other personal data from Auth0 are never persisted to the database
- Hashed identifiers: The Auth0 user identifier (`sub`) is hashed using HMAC-SHA256 with a secret key before storage
- Session-based data: User profile information (name, email) is only available during the active session and is not saved
- Secure hashing: The `ENCRYPTION_KEY` environment variable is used as the secret for hashing, ensuring identifiers cannot be reversed
- Export Data: Users may export their own data (e.g., sleep entries) for full transparency about what is collected and what they may wish to do with it
- Account Deletion: Deleting an account via Profile → Account Deletion removes all user data stored within the models
To initiate a login flow, redirect users to /auth/login:

```js
// In a controller
res.redirect('/auth/login');

// Or with a custom return URL
res.redirect('/auth/login?returnTo=/dashboard');
```

The login route (/auth/login) automatically redirects to Auth0's hosted login
page. After successful authentication, users are redirected back to the
application (default: /dashboard).

To log out a user, redirect to /auth/logout:

```js
// In a controller
res.redirect('/auth/logout');
```

Or in a template:

```html
<a href="/auth/logout">Sign Out</a>
```

The logout route invalidates the Auth0 session and redirects users to the home
page (/).
Authentication checks should be handled by middleware at the route level. The application provides two middleware functions for protecting routes:
- `requireAuthRoute`: Use this for page routes (HTML responses). If the user is not authenticated, it redirects them to `/auth/login` with a `returnTo` parameter containing the original URL. If authenticated, it calls `next()` to proceed to the route handler.
- `requireAuthAPI`: Use this for API routes (JSON responses). If the user is not authenticated, it returns a `401` status with a JSON error response. If authenticated, it calls `next()` to proceed to the route handler.
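A minimal sketch of what these two guards might look like, assuming the `req.oidc.isAuthenticated()` API exposed by express-openid-connect (the actual implementations live in src/helpers/auth.js and may differ):

```javascript
// Page guard: redirect unauthenticated users to login, preserving the
// original URL so they return to it after authenticating.
function requireAuthRoute(req, res, next) {
  if (req.oidc && req.oidc.isAuthenticated()) return next();
  res.redirect(`/auth/login?returnTo=${encodeURIComponent(req.originalUrl)}`);
}

// API guard: return a 401 JSON error instead of redirecting.
function requireAuthAPI(req, res, next) {
  if (req.oidc && req.oidc.isAuthenticated()) return next();
  res.status(401).json({
    success: false,
    error: { code: 'AUTH_REQUIRED', message: 'Authentication required' },
  });
}
```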
The userSyncMiddleware automatically populates res.locals with
authentication information when a user is signed in. Controllers can access
user information from res.locals when needed:
```js
function myController(req, res) {
  // Access user information from res.locals (set by middleware)
  const isAuthenticated = res.locals.isAuthenticated; // Flag to check user auth status
  const isFirstLogin = res.locals.isFirstLogin; // Flag to check if first login
  const displayName = res.locals.displayName; // User's display name
  const userProfile = res.locals.userProfile; // User profile data
  const userRecord = res.locals.userRecord; // User record from DB

  res.render('template', {
    displayName,
    isAuthenticated, // pass the actual flag through to the template
  });
}
```

Authentication data set by the middleware is also available in templates:
```ejs
<% if (typeof isAuthenticated !== 'undefined' && isAuthenticated) { %>
  <p>Welcome, <%= displayName %>!</p>
  <a href="/dashboard">My Sleep Data</a>
  <a href="/auth/logout">Sign Out</a>
<% } else { %>
  <a href="/auth/login">Sign In / Register</a>
<% } %>
```

The userSyncMiddleware populates the following in res.locals:

- `isAuthenticated` (boolean): Whether the user is currently authenticated
- `user` (object|null): The full Auth0 user object (only during session)
- `displayName` (string|null): User's display name (name or email, falling back to null)
- `userProfile` (object|null): Non-sensitive profile with `sub`, `email`, and `name`
- `userRecord` (object|null): The database user record (contains only the hashed identifier and timestamps)
- `isFirstLogin` (boolean): Whether this is the user's first login (based on timestamps)
To protect routes and require authentication, use the provided middleware functions in your route definitions:
For Page Routes (HTML responses):
Use requireAuthRoute middleware to protect page routes. Unauthenticated users
will be redirected to /auth/login with a returnTo parameter:
```js
const express = require('express');
const { requireAuthRoute } = require('../helpers/auth');
const { renderDashboard } = require('../controllers/dashboardControllers');

const router = express.Router();

// Protected route - redirects to login if not authenticated
router.get('/dashboard', requireAuthRoute, renderDashboard);

module.exports = router;
```

For API Routes (JSON responses):
Use requireAuthAPI middleware to protect API routes. Unauthenticated users
will receive a 401 JSON error:
```js
const express = require('express');
const { requireAuthAPI } = require('../helpers/auth');
const { getUserData } = require('../controllers/apiControllers');

const router = express.Router();

// Protected API route - returns 401 JSON error if not authenticated
router.get('/api/user', requireAuthAPI, getUserData);

module.exports = router;
```

Middleware Behavior:

- `requireAuthRoute`: Redirects unauthenticated users to `/auth/login?returnTo=<originalUrl>`
- `requireAuthAPI`: Returns a `401` status with a JSON error: `{ success: false, error: { code: 'AUTH_REQUIRED', message: 'Authentication required' } }`
- Both middleware functions call `next()` if the user is authenticated, allowing the route handler to proceed
When a user logs in for the first time:
- Auth0 authenticates the user and returns their profile
- The `userSyncMiddleware` extracts the Auth0 identifier (`sub`)
- The identifier is hashed using HMAC-SHA256 with `ENCRYPTION_KEY`
- A `User` document is created/updated in MongoDB with only:
  - `authIdHash`: The hashed Auth0 identifier
  - `lastLoginAt`: Timestamp of the last login
  - `createdAt` and `updatedAt`: Automatic timestamps
No email addresses, names, or other personal information are stored in the database.
The application uses Contentful as a Headless CMS to decouple content management from core application logic. This allows for real-time updates to educational content, sleep insights, and assets without requiring code deployments.
- Fetches sleep-related educational content and AI analysis prompts via the Contentful Delivery API.
- Centralises all images, logos, and media assets within the Contentful media library.
- Uses defined Content Types to ensure structural consistency across the articles.
To enable Contentful integration, ensure the following keys are set in your
.env file:
CONTENTFUL_SPACE_ID=replace-with-space-id
CONTENTFUL_ACCESS_TOKEN=replace-with-access-token
The application uses OpenAI's Large Language Models (LLMs) to provide users with a personalised sleep health consultant and score directly in their dashboard.
- Analyses the last 7 days of sleep logs to calculate a comprehensive "Sleep Score" out of 100 based on duration and consistency.
- Generates three distinct output sections: Headline Insight, Data-Driven Analysis, and Actionable Recommendations.
- To minimise API costs, the system compares logs and goals against a MongoDB cache; new generation only occurs if data has actually changed.
To enable OpenAI integration, ensure the following token is set in your .env
file:
OPENAI_API_KEY=replace-with-openai-api-key
Create a .env file in the project root (or use system environment variables)
with the following keys:
| Variable | Example | Description |
|---|---|---|
| `PORT` | `3000` | Application port |
| `BASE_URL` | `http://localhost:3000` | Application base URL |
| `MONGODB_URI` | `mongodb://localhost:27017/alive-sleep-tracker` | MongoDB connection string |
| `NODE_ENV` | `development` | Node environment (development, test, production) |
| `ENCRYPTION_KEY` | `development-only-secret-key` | Secret used to hash Auth0 identifiers |
| `AUTH0_ISSUER_BASE_URL` | `https://dev-example.us.auth0.com` | Auth0 application domain |
| `AUTH0_CLIENT_ID` | `replace-with-auth0-client-id` | Auth0 client ID |
| `AUTH0_CLIENT_SECRET` | `replace-with-auth0-client-secret` | Auth0 client secret |
| `AUTH0_SECRET` | `replace-with-auth0-session-secret` | Auth0 session secret |
| `OPENAI_API_KEY` | `replace-with-openai-api-key` | OpenAI API key for AI-generated sleep insights |
| `CONTENTFUL_SPACE_ID` | `replace-with-space-id` | Contentful space ID for insights/articles |
| `CONTENTFUL_ACCESS_TOKEN` | `replace-with-access-token` | Contentful access token for CMS content |
| `VERCEL` | (optional) | Serverless environment flag (Vercel) |
- Clone the repository:
  git clone https://github.com/sleepTrackerApp/sleepTrackerApp.git
- Install dependencies:
  npm install
- Set up environment variables:
  Create a `.env` file in the project root based on `.env.example`
- Run tests (optional):
  npm test
- Start the development server:
  cd src
  node server.js
- Open the application in a browser at http://localhost:3000
- Sensitive configuration values should never be committed to GitHub
- Feature branches and pull requests must be used to ensure code quality and review
- Commit messages should follow a consistent, descriptive style
To ensure the ALIVE application remains consistent with our [Base UI Style Guide], we use automated linting and formatting. This keeps our code clean, prevents bugs, and ensures our design tokens, such as the Primary Brand Color (#2B3990) and Midnight Background (#121212), are used correctly.
- ESLint (eslint.config.js): Acts as our "Logic Guard." It catches errors, unused variables, and suspicious code patterns before they reach production.
- Prettier (.prettierrc.json): Acts as our "Style Guard." It automatically handles code indentation and layout to match our 2-space standard, making the code as organized as our UI spacing scale (4px to 40px).

Before submitting your code, please run the following commands:

- `npm run format`: Automatically fixes code styling (indentation, quotes, etc.)
- `npm run lint`: Scans for logic errors and style guide violations
- `npm run lint:fix`: Automatically fixes any basic issues found by the linter
For the best experience, install the ESLint and Prettier extensions. We recommend enabling "Format on Save" in your settings so that your code snaps to the ALIVE style guide automatically every time you save your work.
The backend now supports recording daily sleep entries with automatic duration calculation and quality mapping.
- Endpoint: POST /recordsleep
- Purpose: Validates and stores sleep session data in MongoDB.
- Request Body (JSON):

  ```json
  {
    "startTime": "YYYY-MM-DDTHH:MM:SS",
    "endTime": "YYYY-MM-DDTHH:MM:SS",
    "quality": "good | poor | missed"
  }
  ```

- Duration Calculation: The server automatically calculates the hours slept from the difference between startTime and endTime.
- Quality Mapping: Maps string inputs (good, poor, missed) to the numerical rating required by the SleepEntry schema.
- Error Handling: try/catch blocks ensure a 500 Internal Server Error is returned on database failures instead of a crash.
Tested extensively via Postman to ensure a 201 Created status and successful MongoDB document creation (verified by the return of a unique _id).
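The duration calculation and quality mapping described above can be sketched as follows (the numeric ratings here are assumptions for illustration, not necessarily the SleepEntry schema's actual values):

```javascript
// Hypothetical mapping from the string inputs accepted by POST /recordsleep
// to a numeric rating. The real schema values may differ.
const QUALITY_RATINGS = { good: 3, poor: 1, missed: 0 };

function mapQuality(quality) {
  if (!(quality in QUALITY_RATINGS)) {
    throw new Error(`Unknown quality: ${quality}`);
  }
  return QUALITY_RATINGS[quality];
}

// Hours slept from the difference between start and end timestamps.
function calculateHoursSlept(startTime, endTime) {
  const ms = new Date(endTime) - new Date(startTime);
  if (Number.isNaN(ms) || ms <= 0) {
    throw new Error('endTime must be after startTime');
  }
  return ms / (1000 * 60 * 60);
}

console.log(calculateHoursSlept('2024-01-01T22:00:00Z', '2024-01-02T06:30:00Z')); // 8.5
```

Rejecting a non-positive interval is what lets the route return a clean 400/500 error rather than persisting a nonsense entry.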
| Name | Student ID |
|---|---|
| Andrej Kudriavcev | 224939307 |
| Akashdeep Singh | 224911605 |
| Mi Vo | 224505179 |
| Naren Madabooshi Onamalai | 225281581 |
| Winston Dang | 222038631 |