macai (macOS AI) is a simple yet powerful native macOS AI chat client that supports most AI providers: ChatGPT, Claude, xAI (Grok), Google Gemini, Perplexity, Ollama, OpenRouter, and almost any OpenAI-compatible API.

Download the latest universal binary, notarized by Apple.
Install the macai cask with Homebrew:
brew install --cask macai
Check out the main branch and open the project in Xcode 14.3 or later.
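For example, building from source might look like this (the repository URL and project file name below are assumptions, adjust them to your checkout):

```bash
# Clone the repository and open the project in Xcode 14.3 or later
# (repository URL and .xcodeproj name are illustrative)
git clone https://github.com/Renset/macai.git
cd macai
open macai.xcodeproj
```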
Contributions are welcome. Take a look at the Issues page to see already reported features and bugs before creating a new one. If you plan to fix a bug or implement a feature, pick any of the open, unassigned issues and feel free to start working on it.
You can also support the project financially. This support is very important to me and allows me to focus on macai development.
- Native macOS application built with SwiftUI for optimal performance and system integration
- Lightning-fast search across all chats, messages, and personas
- Multi-LLM support including:
- OpenAI ChatGPT models (gpt-4o, o1-mini, o1-preview, and others)
- Anthropic Claude
- Google Gemini
- xAI Grok
- Perplexity
- OpenRouter
- Local LLMs via Ollama
- Any other OpenAI-compatible API
- Image upload support for certain APIs and models
- AI Personas with customizable:
- System instructions
- Temperature settings
- Intelligent message handling:
- Streamed responses for real-time interaction
- Adjustable chat context size
- Automatic chat naming
- Rich content support:
- Syntax-highlighted code blocks
- Interactive HTML/CSS/JavaScript preview
- Formatted tables with CSV/JSON export
- LaTeX equation rendering
- 100% local data storage
- No telemetry or usage tracking
- Built-in backup/restore functionality with JSON export
- Complete control over API configurations and keys
- System-native light/dark theme
- Per-chat customizable system instructions
- Clean, native macOS interface
- Minimal resource usage compared to Electron-based alternatives
To run macai with ChatGPT or Claude, you need an API token. An API token is like a password: you have to obtain one before you can use any commercial LLM API. Most API services offer free credits when you register a new account, so you can try most of them for free. Here is how to get an API token for each supported service (a sketch of how the token is used follows the list):
- OpenAI: https://help.openai.com/en/articles/4936850-where-do-i-find-my-secret-api-key
- Claude: https://docs.anthropic.com/en/api/getting-started
- Google Gemini: https://ai.google.dev/gemini-api/docs/api-key (free models available 🔥)
- xAI Grok: https://docs.x.ai/docs#models
- OpenRouter: https://openrouter.ai/docs/api-reference/authentication#using-an-api-key (> 50 free models 🔥)
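For reference, here is a minimal sketch of how such a token is typically used with an OpenAI-compatible API: the key is sent as a Bearer token in the Authorization header of each request. The endpoint, model name, and `OPENAI_API_KEY` environment variable are illustrative, not macai configuration; once the same token is pasted into macai's API Service settings, the app sends these requests for you.

```bash
# Illustrative chat completions request against an OpenAI-compatible endpoint;
# OPENAI_API_KEY holds the token obtained from the provider.
curl https://api.openai.com/v1/chat/completions \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "Hello!"}]}'
```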
If you are new to LLMs and don't want to pay for tokens, take a look at Ollama. It supports dozens of open-source LLM models that can run locally on Apple M1/M2/M3/M4 Macs.
Run with Ollama
Ollama is an open-source back end for running various LLM models. Running macai with Ollama is easy-peasy:
- Install Ollama from the official website
- Follow the installation guide
- After installation, select a model (llama3.1 or llama3.2 are recommended) and pull it using the following command in Terminal:
ollama pull <model>
- In macai settings, open the API Service tab, add a new API service, and select the type "ollama"
- Select a model and a default AI Persona, then save
- Test and enjoy! (a quick sanity check is shown below)
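If something doesn't respond, a quick way to check that Ollama itself is up (independently of macai) is to list the models it has pulled; the commands below assume Ollama's default local port, 11434.

```bash
# Verify the Ollama server is running and your model was pulled
ollama list                            # shows locally available models
curl http://localhost:11434/api/tags   # same check via Ollama's local HTTP API
```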
macOS 13.0 or later (both Intel and Apple Silicon are supported)
The project is under active development.

API Service, AI Persona, and system message are customizable in any chat at any time