Native AI Integration for the Modern Web
Seamlessly integrate powerful language models directly into web applications with no network latency, complete privacy, and native performance on macOS.
Requires macOS® 26 Tahoe and Apple Intelligence®
Run language models locally with no network delays. Instant responses for real-time applications and interactive experiences.
Your data never leaves your device. Process sensitive information locally without sending it to external APIs or cloud services.
Leverage Apple's FoundationModels framework for optimized performance on macOS with hardware acceleration.
Simple JavaScript API that works with any web framework. Add AI capabilities to existing applications in minutes.
Stream responses token by token for responsive user experiences. Perfect for chat interfaces and live content generation.
Clean, modern API with TypeScript support, comprehensive documentation, and easy integration patterns.
This repository contains all the components needed to run Native Foundation Models on your Mac:
The macOS installer application that sets up Native Foundation Models on your system. This SwiftUI app handles:
- Installation of the native binary to `~/bin`
- Configuration of native messaging host
- Chrome extension installation guidance
- System requirements verification
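For context on the native messaging host configuration above: Chrome registers native hosts through a small JSON manifest stored in the user's profile (on macOS, typically under ~/Library/Application Support/Google/Chrome/NativeMessagingHosts/). The installer writes this file for you; the sketch below only illustrates the general format, and the host name, binary path, and extension ID are placeholders rather than the values this project actually uses.

{
  "name": "com.example.native_foundation_models",
  "description": "Native Foundation Models host (placeholder)",
  "path": "/Users/yourname/bin/nativefoundationmodels",
  "type": "stdio",
  "allowed_origins": ["chrome-extension://<extension-id>/"]
}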
The Chrome extension that bridges web applications with the native host. Features include:
- JavaScript API (`window.nativeFoundationModels`)
- Automatic connection management
- Error handling and retries
- TypeScript type definitions
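The extension ships its own type definitions. Purely as an illustration, and inferred from the quick start examples further down rather than copied from the published .d.ts file, the API surface looks roughly like this:

// Illustrative only: shapes inferred from the quick start examples below,
// not the extension's actual type definitions.
declare global {
  interface Window {
    nativeFoundationModels?: {
      // Reports whether the local model can be used on this device
      checkAvailability(): Promise<{ available: boolean }>;
      // Resolves with an OpenAI-style completion object
      getCompletion(prompt: string): Promise<{
        choices: { message: { content: string } }[];
      }>;
      // Resolves with an async iterable of OpenAI-style streaming chunks
      getCompletionStream(prompt: string): Promise<
        AsyncIterable<{ choices: { delta?: { content?: string } }[] }>
      >;
    };
  }
}
export {};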
The project website and documentation, including:
- Interactive demos
- API documentation
- Integration examples
- Getting started guide
- Download the installer: Download NativeFoundationModels.zip
- Run the macOS app: Open the downloaded app and follow the installation steps
- Install the Chrome extension: The app will guide you to install the browser extension
- Start coding: Use the simple JavaScript API in your web applications
// Check if Native Foundation Models is available (OpenAI-compatible)
const status = await window.nativeFoundationModels.checkAvailability();
if (status.available) {
  console.log('Ready to use!');
}
// Generate content (OpenAI-compatible format)
const result = await window.nativeFoundationModels.getCompletion('Explain quantum computing');
console.log(result.choices[0].message.content);
// Stream responses (yields OpenAI-compatible chunks)
const stream = await window.nativeFoundationModels.getCompletionStream('Write a story');
for await (const chunk of stream) {
  const content = chunk.choices[0]?.delta?.content; // Extract content from chunk
  if (content) {
    updateUI(content);
  }
}
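The snippets above show the happy path. Because the API is only present when the extension and native host are installed, application code usually wants a guard around it. The sketch below is one way to do that; askLocalModel and its error messages are illustrative, not part of the shipped API.

// Illustrative wrapper, not part of the shipped API: checks that the bridge
// exists and that the local model is available before requesting a completion.
async function askLocalModel(prompt: string): Promise<string> {
  const nfm = (window as any).nativeFoundationModels;
  if (!nfm) {
    throw new Error('Native Foundation Models extension is not installed');
  }

  const status = await nfm.checkAvailability();
  if (!status.available) {
    throw new Error('The local model is not available on this device');
  }

  try {
    const result = await nfm.getCompletion(prompt);
    return result.choices[0].message.content;
  } catch (err) {
    // The extension handles retries for transient failures; anything that
    // still reaches here is surfaced to the caller.
    console.error('Completion failed:', err);
    throw err;
  }
}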
Each component has its own build process and README:
- macOS Container App: Xcode project using SwiftUI
- Native App: Swift package implementing the native messaging protocol
- Chrome Extension: JavaScript/TypeScript using Manifest V3
- Website: Static HTML with live demos
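For contributors curious about the native messaging protocol mentioned above: on the browser side, a Manifest V3 extension typically opens a port to the native host with chrome.runtime.connectNative and exchanges JSON messages over it, as in the sketch below. The host name and message shape are placeholders, not this project's actual wire format.

// Illustrative sketch of Chrome's native messaging from an extension's
// background service worker; not this project's actual implementation.
const port = chrome.runtime.connectNative('com.example.native_foundation_models');

port.onMessage.addListener((message) => {
  // Messages are JSON objects; Chrome handles the length-prefixed framing
  // used by the native messaging protocol on the wire.
  console.log('From native host:', message);
});

port.onDisconnect.addListener(() => {
  console.warn('Native host disconnected:', chrome.runtime.lastError?.message);
});

// Placeholder message shape, for illustration only.
port.postMessage({ type: 'prompt', prompt: 'Hello from the extension' });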
This project is licensed under the MIT License - see the LICENSE file for details.
Contributions are welcome! Please feel free to submit a Pull Request.
Created by @zats
Built with 🖤 for the developer community