A menu bar app that turns freeform notes into actionable checklists using a local AI model (Ollama + Qwen). Lives in your menu bar, no cloud required. Works on macOS and Windows.
- macOS (Apple Silicon or Intel) or Windows 10/11
- Node.js v18 or later — download from nodejs.org (only needed for development)
- 16 GB RAM recommended (the Qwen 3 8B model uses ~8 GB)
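If you're unsure which Node.js version you have, a quick terminal check (the fallback message is only for machines without Node on the PATH):

```bash
# Print the installed Node.js major version; v18+ is needed for development.
NODE_MAJOR=$(node --version 2>/dev/null | sed 's/^v//' | cut -d. -f1)
if [ -n "$NODE_MAJOR" ]; then
  echo "Node major version: $NODE_MAJOR"
else
  echo "Node.js not found on PATH"
fi
```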
- Download the latest release for your platform (`.dmg` for macOS, `.exe` installer for Windows)
- Open the app — a setup wizard will walk you through installing Ollama and downloading the AI model
- Start using it
```bash
git clone https://github.com/isaacHuh/simple_notes.git
cd simple_notes
npm install
npm start
```

On first launch, the setup wizard will guide you through any remaining setup (installing Ollama, pulling the model). You can also set things up manually — see below.
When you first open SimpleNotes (or if Ollama/the model becomes unavailable), a guided setup wizard appears:
Step 1 — Ollama installed & running
- The app checks if the Ollama server is reachable at `localhost:11434`
- If not found, click "Download Ollama" to open the correct download page for your platform
- After installing, open the Ollama app and click "Check again"
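If you'd rather verify the server from a terminal than rely on the wizard's check, a minimal sketch (assuming Ollama's default port; a running server answers `GET /` with "Ollama is running"):

```bash
# Probe Ollama's default endpoint; curl -sf fails silently if nothing is listening.
if curl -sf http://localhost:11434/ >/dev/null 2>&1; then
  STATUS="reachable"
else
  STATUS="not reachable"
fi
echo "Ollama server is $STATUS"
```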
Step 2 — Model ready
- The app checks if the configured model (default: `qwen3:8b`) is downloaded
- If not, click "Pull Model" to download it directly from the app with a live progress bar
- The download is ~4.7 GB
Once both steps show green checkmarks, click "Get Started".
You can also click "Skip setup" if you prefer to configure things manually or use a different model.
If you prefer to set up Ollama yourself instead of using the wizard:
macOS:
```bash
brew install ollama
```

Or download from ollama.com/download/mac.
Windows: Download the installer from ollama.com/download/windows.
macOS: Open the Ollama app, or run in a terminal:
```bash
ollama serve
```

Windows: Open Ollama from the Start menu. The server starts automatically.
The server runs on http://localhost:11434.
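To confirm the server is actually answering (not just installed), you can query its version endpoint; the fallback string here is only so the command degrades gracefully when the server is down:

```bash
# Query the Ollama server's version; prints a placeholder if it's unreachable.
VERSION_JSON=$(curl -sf http://localhost:11434/api/version || echo '{"version":"server not reachable"}')
echo "$VERSION_JSON"
```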
```bash
ollama pull qwen3:8b
ollama run qwen3:8b "List 3 things to do before a road trip"
```

- Type a note in the input bar at the bottom — anything like "Meeting notes: discuss Q3 budget, review hiring pipeline, schedule team offsite"
- Press Enter — the note is sent to your local model, which extracts actionable checklist items
- Check off items as you complete them — they move to the "Completed" section
- Drag and drop one task onto another to merge them with AI
- Click + on any task to add context or sub-tasks
- Right-click any item to delete it
- Green dot in the title bar = Ollama is connected. Red dot = Ollama is not reachable.
Your checklist is saved automatically and persists across restarts.
- macOS: `~/Library/Application Support/NoteFlow/data.json`
- Windows: `%APPDATA%/NoteFlow/data.json`
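The save file is plain JSON, so you can inspect it directly; a sketch for the macOS path (the exact schema isn't documented here, so treat the output as read-only):

```bash
# Pretty-print the saved checklist, if one exists yet (macOS path).
DATA="$HOME/Library/Application Support/NoteFlow/data.json"
if [ -f "$DATA" ]; then
  python3 -m json.tool "$DATA"
else
  echo "No data file yet at: $DATA"
fi
```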
Red status dot / "Ollama is not running"
- Make sure Ollama is open (macOS: check menu bar; Windows: check system tray)
- Or run `ollama serve` in a terminal
- Check that nothing else is using port 11434
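On macOS (or Linux) you can see what, if anything, is holding the port with `lsof`; on Windows, `netstat -ano | findstr 11434` is the rough equivalent:

```bash
# Show any process listening on Ollama's default port (11434).
LISTENERS=$(lsof -nP -iTCP:11434 -sTCP:LISTEN 2>/dev/null || true)
if [ -n "$LISTENERS" ]; then
  echo "$LISTENERS"
else
  echo "Port 11434 is free"
fi
```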
"Model not found"
- Run `ollama pull qwen3:8b` to download the model
- Or use the setup wizard (it will appear automatically)
- Verify with `ollama list`
Slow responses
- Close memory-heavy apps (browsers with many tabs, Docker, etc.) to free up RAM
- Check Activity Monitor (macOS) or Task Manager (Windows) — memory usage should have headroom
App doesn't appear in menu bar / system tray
- macOS: Look for the icon in the top-right area of your menu bar. On Macs with a notch, try `Cmd+drag` to rearrange icons
- Windows: Check the system tray (bottom-right, click the ^ arrow if hidden)
macOS (produces .dmg and .zip):

```bash
npm run build
```

Windows (produces NSIS installer and .zip):

```bash
npm run build:win
```

Both platforms:

```bash
npm run build:all
```

Output goes to the `dist/` directory.
- Electron + menubar — tray icon and floating panel
- Ollama — local AI inference server
- Qwen 3 8B — language model for task extraction
- electron-builder — packaging for macOS (DMG) and Windows (NSIS)
- Vanilla HTML/CSS/JS — no frontend framework
Contributions are welcome! Please see CONTRIBUTING.md for guidelines.
This project is licensed under the MIT License.