CodeGeekR/Run-Locally-Ai
RunLocallyAi ⚡️

The Ultimate Way to Run AI Locally on Mac (Apple Silicon & Intel)

macOS 13.0+ · SwiftUI

RunLocallyAi is an open-source application designed to help you run AI locally with zero friction. Stop guessing which AI models your specific hardware can run. Our intelligent benchmark engine analyzes your Mac's CPU, GPU, NPU, and memory bandwidth to recommend and install the best LLM to run locally, based on real measurements.

Screenshot: how to run an AI locally on a Mac mini

🌟 Why RunLocallyAi?

Most users don't know how to run AI locally on a Mac, or which model size fits their RAM. RunLocallyAi solves this by bridging the gap between hardware power and AI performance. Whether you have a Mac mini, Mac Studio, or MacBook Pro, it ensures you run AI at peak efficiency.

Key Benefits:

  • 🔒 Privacy First: All data stays on your machine. Running AI locally means no cloud leaks.
  • 💸 Cost Effective: Forget expensive subscriptions. Run AI on your Mac for free, forever.
  • ⚡️ Optimized Performance: We identify the best way to run an LLM on your Mac by matching your NPU/GPU specs with the most efficient model quantizations, using zero-copy inference directly on unified memory.

🛠 Features & Capabilities

1. Smart Hardware Benchmark (Scoring Engine)

Our algorithm performs a deep-dive analysis of your system using low-level APIs (sysctl, IOKit, Metal) to answer: "Which local AI model is best for me?"

  • NPU Optimization: Full support for Apple Silicon Neural Engine (ANE).
  • Memory Bandwidth Scaling: Precise LLM recommendations based on your unified memory (UMA) capacity and bandwidth.
  • VRAM Fit Score: Deterministic algorithm that prevents Out-Of-Memory kernel panics.
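To make the "fit score" idea concrete, here is a minimal sketch of how such a check might work. The type names, the 4-bit quantization math, and the headroom percentages are illustrative assumptions, not the app's actual algorithm; the only platform fact used is `ProcessInfo.physicalMemory` for total (unified) memory.

```swift
import Foundation

// Hypothetical sketch: does a quantized model fit in unified memory,
// leaving headroom for the KV cache and the operating system?
struct ModelCandidate {
    let name: String
    let parameterCount: Double   // in billions, e.g. 7 for a 7B model
    let bitsPerWeight: Double    // e.g. 4.0 for Q4 quantization
}

/// Estimated resident size of the model weights, in bytes.
func estimatedWeightBytes(_ model: ModelCandidate) -> Double {
    model.parameterCount * 1_000_000_000 * model.bitsPerWeight / 8
}

/// Returns a score in 0...1; 0 means the model cannot fit safely.
func fitScore(model: ModelCandidate,
              totalMemoryBytes: Double,
              reservedForSystem: Double = 0.25) -> Double {
    let usable = totalMemoryBytes * (1 - reservedForSystem)
    let needed = estimatedWeightBytes(model) * 1.2  // ~20% extra for the KV cache
    guard needed < usable else { return 0 }
    return 1 - needed / usable
}

let ram = Double(ProcessInfo.processInfo.physicalMemory)
let candidate = ModelCandidate(name: "Qwen2.5-Coder-7B", parameterCount: 7, bitsPerWeight: 4)
print(String(format: "%@ fit score: %.2f",
             candidate.name, fitScore(model: candidate, totalMemoryBytes: ram)))
```

A deterministic cutoff like the `guard` above is what makes the score safe: a model that would exceed usable memory is rejected outright instead of being ranked low.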

2. Specialized AI Categories

Download and run AI models locally, tailored to your specific needs, straight from our frequently updated catalog:

  • 💻 Coding & Agents: The best LLMs to run locally for developers (e.g., Qwen Coder, DeepSeek).
  • 🧠 Reasoning: High-logic models for complex problem solving (o1-style models).
  • 🎨 Creative Suite: Local AI image generation (Flux, SDXL), local AI video generation (Wan, LTX), and text-to-speech tools.

3. One-Click Installation

Wondering how to run a local LLM? Simply choose a category, and RunLocallyAi handles the download via the HuggingFace Hub API and sets up the environment. It's the easiest way to install an LLM locally on a Mac.
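The Hub serves raw model files over plain HTTPS at `https://huggingface.co/<repo>/resolve/<revision>/<file>`, so a download step can be sketched with nothing but Foundation. The repo and file names below are illustrative, not RunLocallyAi's actual catalog entries.

```swift
import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking  // URLSession lives here on non-Apple platforms
#endif

/// Builds a HuggingFace Hub "resolve" URL for a file in a model repository.
func hubDownloadURL(repo: String, file: String, revision: String = "main") -> URL? {
    URL(string: "https://huggingface.co/\(repo)/resolve/\(revision)/\(file)")
}

if let url = hubDownloadURL(repo: "Qwen/Qwen2.5-7B-Instruct-GGUF",
                            file: "qwen2.5-7b-instruct-q4_k_m.gguf") {
    // URLSession follows the redirect to the Hub's CDN transparently.
    let task = URLSession.shared.downloadTask(with: url) { tempFile, _, error in
        // In a real app, move tempFile into the model library here.
    }
    _ = task  // task.resume() would start the transfer; left inert in this sketch
}
```

Keeping downloads on the `resolve` endpoint also means resumable transfers come for free via standard HTTP range requests.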


🖥 User Interface (Liquid Glass Aesthetic)

Experience a stunning UI built for modern macOS (Ventura 13.0+). RunLocallyAi isn't just a technical tool; it's a native-feeling macOS experience.

  • Liquid Glass: Translucent materials, dynamic blur layers, and glassmorphism powered by NSVisualEffectView.
  • Visual Benchmarks: See your NPU, GPU load, and Memory Bandwidth in real-time.
  • Smart Library: Manage all your local AI models in one beautiful dashboard.
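The glassmorphism effect comes from wrapping AppKit's `NSVisualEffectView` so SwiftUI views can sit on a blurred material; this is a common pattern, and the material and blending choices below are illustrative defaults, not the app's actual theme.

```swift
#if canImport(AppKit)
import AppKit
import SwiftUI

// Bridges NSVisualEffectView into SwiftUI for translucent "liquid glass" panels.
struct GlassBackground: NSViewRepresentable {
    var material: NSVisualEffectView.Material = .hudWindow
    var blendingMode: NSVisualEffectView.BlendingMode = .behindWindow

    func makeNSView(context: Context) -> NSVisualEffectView {
        let view = NSVisualEffectView()
        view.material = material
        view.blendingMode = blendingMode
        view.state = .active   // keep the blur even when the window is inactive
        return view
    }

    func updateNSView(_ view: NSVisualEffectView, context: Context) {
        view.material = material
        view.blendingMode = blendingMode
    }
}

// Usage inside any SwiftUI view: .background(GlassBackground())
#endif
```

Setting `state = .active` is the usual trick for a dashboard look: the material stays vibrant instead of flattening when focus moves to another window.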

🚀 Quick Start: How to Run AI Locally

  1. Download: Get the latest .dmg from the Releases page. (Universal Binary for Apple Silicon & Intel).
  2. Benchmark: Open the app and let it analyze your Mac's LLM capabilities.
  3. Select: Choose from the recommended list (e.g., Llama 3, Qwen, or a specialized local AI image generator).
  4. Run: Click "Execute" and start chatting or generating locally.

🏗 Technical Architecture

  • 100% Native: Built with SwiftUI and SwiftData.
  • Zero-Copy Inference: Integrates MLX-Swift for direct Metal/GPU execution without C++ translation overhead.
  • Universal Binary: Compiled for both arm64 and x86_64 without runtime architectural assumptions.
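"No runtime architectural assumptions" can be kept honest with compile-time checks: a Universal Binary carries one slice per architecture, and each slice knows at build time which one it is. A minimal sketch (the function name and the per-slice comments are illustrative):

```swift
// Each slice of a Universal Binary is compiled with exactly one of these
// conditions true, so the branch is resolved at compile time, not at runtime.
func currentArchitecture() -> String {
    #if arch(arm64)
    return "arm64"    // Apple Silicon slice: ANE and unified memory available
    #elseif arch(x86_64)
    return "x86_64"   // Intel slice: fall back to CPU/AVX or discrete-GPU paths
    #else
    return "unknown"
    #endif
}

print("Running the \(currentArchitecture()) slice")
```

This is also where an MLX-backed code path would be gated: MLX targets Apple Silicon, so the `arm64` branch is the natural place to enable it.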

🤝 Contributing

We are building the best way to run LLMs on a Mac. If you are experienced in running AI models on a Mac or in macOS development, your contributions are welcome!

  1. Fork the Project.
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature).
  3. Commit your Changes (git commit -m 'Add some AmazingFeature').
  4. Push to the Branch (git push origin feature/AmazingFeature).
  5. Open a Pull Request.

📄 License

Distributed under a Creative Commons license. See LICENSE for more information.

RunLocallyAi — the smartest way to run local AI on the world's best hardware.
