A Chromium-based browser with integrated local LLM capabilities for intelligent web interaction.
## Features
- Full-featured Chromium browser with multi-tab support
- Tab management with keyboard shortcuts (Ctrl+T, Ctrl+W, Ctrl+Tab)
- Navigation controls (back, forward, reload, home)
- History tracking with searchable sidebar
- Bookmarks management system
- Context menus with right-click support
- Tab suspension for better memory management
- Comprehensive security hardening with context isolation
- Developer tools integration (F12)
- Page printing and source viewing
- Zoom controls (Ctrl +/-/0)
- Tab persistence and crash recovery
- Ollama/LLM integration with streaming inference
- Chat sidebar for AI conversations with model capability detection
- Comprehensive model manager with download progress tracking
- Vision-capable and text-only model support
- Automatic GPU acceleration (CUDA, ROCm, Metal)
- Default model selection and persistent settings
- AI Personality Selection - 26 unique personalities across 4 categories:
- Professional (Business Analyst, Technical Expert, Life Coach, etc.)
- Friends (Best Friend, Study Buddy, Workout Partner, etc.)
- Funny (Stand-up Comedian, Sarcastic Friend, Meme Lord, etc.)
- Romantic (Various caring, adventurous, and supportive partners)
- Context-Aware AI - Smart context optimization with:
- Page content capture and analysis
- Browsing history context injection
- Bookmarks context injection
- Token estimation and optimization
- Vision Model Integration - Screenshot capture for vision-capable models
- Tool Calling/Agent Mode - AI can interact with the browser through 6 tools:
- Search history
- Access bookmarks
- Analyze page content
- Capture screenshots
- Get page metadata
- Perform web searches
- Advanced Reasoning - Chain-of-thought support with thinking token streaming
- Custom System Prompts - Personalize AI behavior with custom instructions
- Comprehensive Download Manager with:
- Real-time download progress tracking
- Pause/resume/cancel functionality
- Download history with categorization (active, completed, failed)
- File type information and metadata
- Download speed and time estimation
- Open files and show in folder
- Automatic cleanup of old downloads
- Mandatory User Agreement - One-time comprehensive disclosure covering:
- Local data storage (no cloud transmission)
- Model behavior and content disclaimers
- AI model context access
- Zero telemetry or data collection
- Privacy-First Design - All data stored locally, no external transmission
- Advanced security hardening with sandboxing and validation
- Smart bookmarking with AI categorization
- Semantic search across browsing history
- Enhanced page summarization with readability optimization
- Multi-language support for UI
- Mobile companion app
- Sync across devices (optional, privacy-preserving)
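The context-aware features above depend on fitting page content, history, and bookmarks into a model's context window. A minimal sketch of how token budgeting might work — the ~4 characters/token heuristic and function names here are illustrative assumptions, not the app's actual estimator:

```typescript
// Rough heuristic: ~4 characters per token for English text (an assumption
// for illustration; real tokenizers vary by model).
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Trim a list of context snippets to fit a token budget,
// keeping earlier (higher-priority) snippets first.
function fitToBudget(snippets: string[], maxTokens: number): string[] {
  const kept: string[] = [];
  let used = 0;
  for (const snippet of snippets) {
    const cost = estimateTokens(snippet);
    if (used + cost > maxTokens) break;
    kept.push(snippet);
    used += cost;
  }
  return kept;
}
```

In practice the page text would be ranked (e.g. main content before navigation chrome) before being passed through such a budgeter.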
## Tech Stack

- Electron - Desktop app framework with embedded Chromium
- React + TypeScript - UI components with modern hooks
- Vite - Fast build tool with Hot Module Replacement
- Tailwind CSS - Utility-first styling
- Zustand - Lightweight state management
- Better-SQLite3 - Local database for history and bookmarks
- Axios - HTTP client for Ollama API communication
- ESLint + Prettier - Code quality and formatting
- Husky - Git hooks for pre-commit checks
- Ollama - Local LLM inference engine (v0.12.9 bundled with application)
## Prerequisites

- Node.js 18+ (LTS recommended)
- npm or pnpm
Note: Ollama is bundled with the application - no separate installation required!
## Getting Started

```bash
# Install dependencies
# This automatically downloads Ollama binaries (~1.8GB)
npm install

# If you need to manually download/update Ollama binaries
npm run setup:ollama

# Start development server
# Ollama will start automatically with the app
npm run dev

# Build for production
npm run build

# Package as distributable (includes Ollama)
npm run package
```

First-time setup:
- When you run `npm install`, the Ollama binaries are downloaded automatically
- This is a one-time download of ~1.8GB (includes Windows and macOS versions)
- The binaries are stored in `resources/bin/` (excluded from git)
## Bundled Ollama
The application includes Ollama v0.12.9 bundled for both Windows and macOS. When you first run the app:
- Ollama starts automatically in the background
- Click the Model Manager button in the navigation bar (or press Ctrl/Cmd+M)
- Download your preferred model (e.g., llama3.2, qwen2.5)
- Start chatting with AI or analyzing web pages!
No manual Ollama installation or configuration needed.
- Version: 0.12.9 (Released: November 1, 2025)
- Platforms: Windows (x64), macOS (Intel + Apple Silicon)
- Size: ~1.8GB (includes CUDA, ROCm support for GPU acceleration)
- Update Instructions: To update Ollama, download the latest release from ollama/ollama/releases and replace the files in `resources/bin/`
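Once running, Ollama serves its HTTP API locally (by default on `http://localhost:11434`) and streams chat responses as newline-delimited JSON, one object per token batch. A hedged sketch of accumulating such a stream — the field names follow Ollama's `/api/chat` response format, while the surrounding transport code (Axios in this app) is omitted:

```typescript
interface ChatChunk {
  message?: { role: string; content: string };
  done: boolean;
}

// Accumulate the assistant's reply from a newline-delimited JSON stream.
// `raw` is the concatenated body text; in the app this would be fed
// incrementally from the HTTP response stream to render tokens live.
function accumulateChat(raw: string): string {
  let reply = "";
  for (const line of raw.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed) continue; // skip blank lines between chunks
    const chunk = JSON.parse(trimmed) as ChatChunk;
    if (chunk.message) reply += chunk.message.content;
    if (chunk.done) break; // final chunk carries timing stats, no content
  }
  return reply;
}
```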
## Project Structure

```
open-browser/
├── src/
│   ├── main/             # Electron main process
│   │   ├── ipc/          # IPC handlers for renderer communication
│   │   ├── services/     # Backend services (database, ollama)
│   │   └── utils/        # Validation and utilities
│   ├── renderer/         # React UI
│   │   ├── components/   # React components (Browser, Chat, etc.)
│   │   ├── store/        # Zustand state management (browser, chat, models)
│   │   └── services/     # Frontend services
│   └── shared/           # Shared types and utilities
├── .github/              # GitHub configuration and workflows
└── TECH_BRIEFING.md      # Comprehensive technical documentation
```
## Documentation
See TECH_BRIEFING.md for comprehensive technical documentation including:
- Architecture diagrams
- API integration patterns
- Model registry format
- Security considerations
- Performance optimization
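Tool calling (Agent Mode) typically works by advertising a list of tool definitions to the model and dispatching its tool-call requests to handlers. A minimal sketch of such a dispatch table — the tool and handler names are illustrative, and each real handler would call into the corresponding browser service over IPC; see TECH_BRIEFING.md for the actual integration:

```typescript
type ToolHandler = (args: Record<string, unknown>) => Promise<string>;

// Illustrative registry mirroring the six agent-mode tools.
const tools: Record<string, ToolHandler> = {
  search_history: async (args) => `history results for "${String(args.query)}"`,
  get_bookmarks: async () => "bookmark list",
  analyze_page: async () => "page content summary",
  capture_screenshot: async () => "base64 screenshot",
  get_page_metadata: async () => "title, url, description",
  web_search: async (args) => `web results for "${String(args.query)}"`,
};

// Dispatch a tool call emitted by the model; unknown tool names are
// rejected so the model cannot invoke arbitrary functions.
async function dispatchToolCall(
  name: string,
  args: Record<string, unknown>
): Promise<string> {
  const handler = tools[name];
  if (!handler) throw new Error(`Unknown tool: ${name}`);
  return handler(args);
}
```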
## Status

**Active Development** - Full-featured browser with comprehensive AI integration, including personality selection, vision models, tool calling, and advanced download management.
- Electron + React + TypeScript setup
- Vite build configuration with HMR
- Security hardening implementation
- Browser UI with navigation and multi-tab support
- Tab management (create, close, switch, suspend)
- Tab persistence and crash recovery
- History tracking and searchable sidebar
- Bookmarks management system
- SQLite database integration
- Context menus and keyboard shortcuts
- Code quality tooling (ESLint, Prettier, Husky)
- CI/CD with GitHub Actions
- Ollama service integration with auto-start capability
- Chat interface with streaming message support
- Comprehensive model manager UI with tabs
- Model registry with 12+ pre-configured models
- Vision vs text-only model capability tracking
- Download progress tracking with real-time updates
- Default model selection with persistent storage
- Model metadata display (size, parameters, capabilities)
- GPU acceleration support (automatic detection)
- IPC handlers for secure LLM operations
- Chat and Model state management with Zustand
- AI Personality Selection - 26 personalities across 4 categories
- Context-Aware AI - Smart context optimization with page analysis
- Vision Model Integration - Screenshot capture and analysis
- Tool Calling/Agent Mode - 6 browser interaction tools
- Advanced Reasoning - Chain-of-thought with thinking tokens
- Custom System Prompts - User-defined AI behavior
- Comprehensive Download Manager - Full download lifecycle management
- Mandatory User Agreement - One-time privacy and terms disclosure
- Content Capture Service - Page text and DOM extraction
- Page Context Extraction - Smart content summarization
Planned:
- Smart bookmarking with AI categorization
- Semantic search across browsing history
- Enhanced page summarization with readability optimization
- Multi-language support for UI
## Keyboard Shortcuts

| Shortcut | Action |
|---|---|
| `Ctrl/Cmd + T` | New tab |
| `Ctrl/Cmd + W` | Close current tab |
| `Ctrl + Tab` | Switch to next tab |
| `Ctrl + Shift + Tab` | Switch to previous tab |
| `Ctrl/Cmd + R` or `F5` | Reload page |
| `Ctrl/Cmd + H` | Toggle history sidebar |
| `Ctrl/Cmd + B` | Toggle bookmarks sidebar |
| `Ctrl/Cmd + M` | Open model manager |
| `Alt + Left` | Go back |
| `Alt + Right` | Go forward |
| `Ctrl/Cmd + Plus` | Zoom in |
| `Ctrl/Cmd + Minus` | Zoom out |
| `Ctrl/Cmd + 0` | Reset zoom |
| `Ctrl/Cmd + P` | Print page |
| `Ctrl/Cmd + U` | View page source |
| `F12` | Open developer tools |
| `Escape` | Stop page loading |
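Internally, shortcuts like these are usually matched against keyboard events. A hedged sketch of such a matcher — a simplification for illustration; the app's real handling may instead live in Electron menu accelerators or a `before-input-event` hook:

```typescript
interface KeyCombo {
  key: string; // e.g. "t", "Tab", "F12"
  ctrlOrCmd?: boolean;
  shift?: boolean;
  alt?: boolean;
}

// Illustrative subset of the shortcut table above.
const shortcuts: Array<{ combo: KeyCombo; action: string }> = [
  { combo: { key: "t", ctrlOrCmd: true }, action: "new-tab" },
  { combo: { key: "w", ctrlOrCmd: true }, action: "close-tab" },
  { combo: { key: "Tab", ctrlOrCmd: true, shift: true }, action: "prev-tab" },
  { combo: { key: "F12" }, action: "toggle-devtools" },
];

interface KeyEventLike {
  key: string;
  ctrlKey: boolean;
  metaKey: boolean; // Cmd on macOS
  shiftKey: boolean;
  altKey: boolean;
}

// Return the action bound to an event, or null if no shortcut matches.
function matchShortcut(e: KeyEventLike): string | null {
  for (const { combo, action } of shortcuts) {
    if (
      e.key === combo.key &&
      (e.ctrlKey || e.metaKey) === !!combo.ctrlOrCmd &&
      e.shiftKey === !!combo.shift &&
      e.altKey === !!combo.alt
    ) {
      return action;
    }
  }
  return null;
}
```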
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request. For major changes, please open an issue first to discuss what you would like to change.
1. Fork the repository
2. Clone your fork: `git clone https://github.com/your-username/browser-llm.git`
3. Install dependencies: `npm install` (this will download Ollama binaries automatically)
4. Create your feature branch: `git checkout -b feature/amazing-feature`
5. Make your changes
6. Run linting and formatting: `npm run lint:fix && npm run format`
7. Commit your changes with a descriptive message
8. Push to your branch: `git push origin feature/amazing-feature`
9. Open a Pull Request
Note: The `resources/bin/` directory is excluded from git. Contributors will automatically download Ollama binaries when running `npm install`.
The project uses:
- ESLint for code linting
- Prettier for code formatting
- Husky for pre-commit hooks
- lint-staged for running checks on staged files
All PRs must pass the automated checks before merging.
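A typical Husky + lint-staged setup runs these checks only on staged files. An illustrative `package.json` fragment — the repo's actual configuration may differ:

```json
{
  "lint-staged": {
    "*.{ts,tsx}": ["eslint --fix", "prettier --write"],
    "*.{json,md,css}": ["prettier --write"]
  }
}
```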
## License

MIT



