VitalLens is a mobile-first wellness check-in that uses phone sensors to estimate pulse, capture breath motion, and generate a wellness-only AI summary with IBM watsonx.
https://vitallens-iota.vercel.app
https://github.com/arliking13/vitallens
Use the GitHub repository as the source-code reference; the zip archive may be too large to upload directly.
- Pulse check: estimates pulse from a finger-camera signal using the rear camera and torch when supported.
- Breath motion check: records small phone movements while the phone rests against the chest or upper abdomen.
- Report screen: combines local Pulse and Breath results into one wellness-only check-in.
- IBM watsonx summary: sends structured results plus compact telemetry to a server-side route for a concise AI summary.
- Safe fallback: keeps local results visible and returns a simple fallback if IBM watsonx is unavailable.
VitalLens explores how ordinary phone sensors can support a quick, accessible wellness check-in without extra hardware. The goal is to make the experience simple on mobile: a rear-camera pulse estimate, an audio-guided breath motion check, and a short report that helps users understand session quality. It is intentionally wellness-only and avoids medical claims.
- Next.js
- React
- TypeScript
- Vercel
- IBM Cloud
- IBM watsonx
- DeviceMotion API
- MediaDevices API / Camera API
- Web Speech API
- CSS
Flow:
- User completes Pulse Check -> result saved in app state.
- User completes Breath Check -> result saved in app state.
- Report screen combines both results.
- Report sends structured data plus compact telemetry to /api/wellness-report.
- The API route calls IBM watsonx.
- If IBM fails, local results remain visible and a fallback summary is shown.
Key folders and files:
- src/app-shell/AppShell.tsx: top-level check flow and shared result state.
- src/features/pulse/: Pulse camera UI, PPG sampling, signal quality, and pulse estimate flow.
- src/features/breath/: Breath motion check, voice guidance, motion waveform, and result UI.
- src/features/report/: Local result cards and IBM summary UI.
- src/app/api/wellness-report/route.ts: server-side IBM watsonx integration.
- src/shared/types/check-flow.ts: shared flow, result, and report payload types.
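The shared types in src/shared/types/check-flow.ts might look roughly like the sketch below. All field and type names here are guesses for illustration; the real file may differ.

```typescript
// Hypothetical shapes for the shared check-flow types; the actual
// src/shared/types/check-flow.ts may use different names and fields.

type PulseResult = {
  bpm: number;
  confidence: "low" | "fair" | "good";
  cleanSeconds: number; // duration of clean samples behind the estimate
};

type BreathResult = {
  motion: "detected" | "low";
  rhythm: "steady" | "uneven" | "not enough motion";
  quality: "good" | "fair" | "low";
  seconds: number;
};

// Payload the Report screen would send to /api/wellness-report.
interface WellnessReportPayload {
  pulse: PulseResult | null;
  breath: BreathResult | null;
  telemetry: {
    pulseTrace: number[]; // downsampled finger-camera trace
    motionTrace: number[]; // downsampled device-motion trace
  };
}
```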
The Pulse Check uses the rear camera stream and the torch/flash when supported by the browser and device. The user covers the rear camera with a finger, and VitalLens samples the finger-camera signal to estimate pulse. The UI tracks signal quality, confidence, and clean sample duration.
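Torch control typically goes through track capabilities and constraints; the torch constraint is non-standard and only some devices expose it. A hedged sketch of the capability check (the helper and its minimal track type are illustrative, not VitalLens's actual code):

```typescript
// Minimal shape of a video track for this sketch; real code would use the
// MediaStreamTrack obtained from navigator.mediaDevices.getUserMedia(...).
interface TorchCapableTrack {
  getCapabilities?: () => { torch?: boolean };
  applyConstraints: (constraints: object) => Promise<void>;
}

// Try to turn the torch on; return false when the device or browser
// does not expose a torch capability.
async function enableTorch(track: TorchCapableTrack): Promise<boolean> {
  const caps = track.getCapabilities?.() ?? {};
  if (!caps.torch) return false;
  await track.applyConstraints({ advanced: [{ torch: true }] });
  return true;
}
```

Checking capabilities first matters because applying an unsupported constraint rejects on some browsers, and the pulse flow should degrade to ambient light rather than fail.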
The pulse value is a wellness-only estimate. VitalLens does not claim clinical accuracy and is not intended for medical decisions.
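In simplified form, the sampling-to-estimate step can be sketched as peak counting on the brightness trace. Real PPG pipelines filter, detrend, and validate far more; this function is illustrative only, not the app's actual algorithm:

```typescript
// Illustrative sketch: estimate pulse (BPM) from a finger-camera brightness
// trace by counting local peaks above the mean.
function estimateBpm(samples: number[], sampleRateHz: number): number {
  // Center the trace around zero so heartbeats show up as positive peaks.
  const mean = samples.reduce((a, b) => a + b, 0) / samples.length;
  const centered = samples.map((v) => v - mean);

  // Count positive local maxima as beats.
  let peaks = 0;
  for (let i = 1; i < centered.length - 1; i++) {
    if (centered[i] > 0 && centered[i] > centered[i - 1] && centered[i] >= centered[i + 1]) {
      peaks++;
    }
  }

  const durationSec = samples.length / sampleRateHz;
  return (peaks / durationSec) * 60;
}
```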
The Breath Motion Check uses the DeviceMotion API while the phone rests against the chest or upper abdomen. Voice guidance helps because the user may not be watching the screen during the check.
During recording, the waveform is generated from motion telemetry. The result includes:
- Motion: detected or low
- Rhythm: steady, uneven, or not enough motion
- Quality: good, fair, or low
- Sample duration
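A steady/uneven call like the one above can be derived from the spread of intervals between detected motion peaks. This is a hedged sketch with invented names and thresholds, not the app's real classifier:

```typescript
type Rhythm = "steady" | "uneven" | "not enough motion";

// Classify breath rhythm from intervals (ms) between detected motion peaks.
// The 0.25 coefficient-of-variation threshold is an arbitrary illustration.
function classifyRhythm(peakIntervalsMs: number[]): Rhythm {
  if (peakIntervalsMs.length < 3) return "not enough motion";
  const mean = peakIntervalsMs.reduce((a, b) => a + b, 0) / peakIntervalsMs.length;
  const variance =
    peakIntervalsMs.reduce((a, b) => a + (b - mean) ** 2, 0) / peakIntervalsMs.length;
  const cv = Math.sqrt(variance) / mean; // relative spread of the intervals
  return cv < 0.25 ? "steady" : "uneven";
}
```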
The IBM summary is generated through a server-side Next.js API route. The route sends final Pulse and Breath results plus compact telemetry, including downsampled traces and basic signal statistics.
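One common way to keep telemetry compact is bucket-averaged downsampling to a fixed point count. This is a sketch of the general idea, not necessarily how VitalLens downsamples its traces:

```typescript
// Reduce a raw trace to a fixed number of points by averaging equal-width
// buckets, so the report payload stays small regardless of session length.
function downsample(trace: number[], points: number): number[] {
  if (trace.length <= points) return trace.slice();
  const out: number[] = [];
  const bucket = trace.length / points;
  for (let i = 0; i < points; i++) {
    const start = Math.floor(i * bucket);
    const end = Math.floor((i + 1) * bucket);
    let sum = 0;
    for (let j = start; j < end; j++) sum += trace[j];
    out.push(sum / (end - start));
  }
  return out;
}
```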
The prompt asks watsonx to interpret:
- Pulse signal reliability
- Breath motion consistency
- Overall session quality
- What may have reduced confidence
- One practical way to repeat the check more cleanly
The prompt also keeps the output wellness-only. It avoids diagnosis, treatment advice, medical claims, and normal/abnormal labels. If IBM watsonx is not configured or unavailable, the app returns a safe fallback while keeping local results visible.
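The fallback behavior can be sketched as a try/catch around the watsonx call. The payload shape and helper names below are invented for illustration; the real route differs:

```typescript
// Hedged sketch of the watsonx-with-fallback pattern the route follows.
interface ReportInput {
  pulseBpm: number | null;
  breathQuality: string;
}

async function buildSummary(
  input: ReportInput,
  callWatsonx: (input: ReportInput) => Promise<string>,
): Promise<{ summary: string; source: "watsonx" | "fallback" }> {
  try {
    return { summary: await callWatsonx(input), source: "watsonx" };
  } catch {
    // Wellness-only fallback: restate local results without interpretation,
    // so the client always has something to render.
    const pulse = input.pulseBpm === null ? "unavailable" : `${input.pulseBpm} bpm`;
    return {
      summary: `Summary unavailable. Local results: pulse ${pulse}, breath quality ${input.breathQuality}.`,
      source: "fallback",
    };
  }
}
```

Injecting the watsonx call as a parameter keeps the fallback logic testable without network access; in the actual route the call would be a fetch to the watsonx endpoint.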
Create .env.local for local development:

```
IBM_WATSONX_API_KEY=
IBM_WATSONX_URL=https://ca-tor.ml.cloud.ibm.com
IBM_WATSONX_MODEL_ID=mistralai/mistral-small-3-1-24b-instruct-2503
IBM_WATSONX_PROJECT_ID=
IBM_WATSONX_VERSION=2024-03-14
```

Do not commit .env.local. Use Vercel Environment Variables for deployment.
Install dependencies:

```
npm install
```

Run locally on Windows:

```
npm.cmd run dev
```

Validate on Windows:

```
npm.cmd run lint
npm.cmd run build
```

Run validation on macOS/Linux:

```
npm run lint
npm run build
```

- Create an IBM Cloud account.
- Create or use a watsonx.ai project.
- Create an IBM Cloud API key or Service ID API key.
- Copy the Project ID from the watsonx.ai project.
- Find an available instruct model for the selected region.
- Add the env vars to .env.local and Vercel.
- Redeploy after changing env vars.
For the current Toronto endpoint, the working model used is:
mistralai/mistral-small-3-1-24b-instruct-2503
VitalLens is deployed on Vercel.
- Connect the GitHub repository to Vercel.
- Add IBM watsonx variables in Vercel Project Settings -> Environment Variables.
- Redeploy after environment variable changes.
- Open the app on a mobile browser for the best demo experience.
- VitalLens is wellness-only and is not a medical device.
- Camera and motion permissions vary by browser and device.
- Torch support depends on the device and browser.
- iPhone Safari requires a user gesture for camera access.
- DeviceMotion access may require explicit permission on iOS.
- IBM summary generation depends on valid env vars, endpoint availability, and model availability.
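The iOS motion-permission note above usually translates into a gated request through the standard DeviceMotionEvent.requestPermission API, which must run inside a user gesture such as a "Start" button tap. A sketch (the helper name is illustrative):

```typescript
// Ask for motion access where the browser gates it (iOS 13+ Safari).
// Must be called from a user gesture, e.g. a tap handler.
async function ensureMotionPermission(): Promise<boolean> {
  const DM = (globalThis as any).DeviceMotionEvent;
  if (DM && typeof DM.requestPermission === "function") {
    return (await DM.requestPermission()) === "granted";
  }
  // No permission gate (most Android and desktop browsers): motion events
  // are available whenever the API exists at all.
  return DM != null;
}
```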
- Open the app on mobile.
- Start Pulse Check.
- Cover the rear camera with a finger.
- Complete the pulse estimate.
- Start Breath Motion Check.
- Place the phone against the chest or upper abdomen.
- Follow voice guidance.
- Open Report.
- Generate the IBM summary.
- Downloadable report image
- Broader device testing
- Better signal quality guidance
- Richer IBM report interpretation
- Optional history/comparison snapshots
VitalLens is a wellness-only project and is not intended for diagnosis, treatment, or medical decision-making.