LocalAI

LocalAI is an open-source alternative to commercial AI APIs, offering a drop-in replacement that's fully compatible with OpenAI, Anthropic, and ElevenLabs API specifications. Run AI models locally or on-premises with complete data privacy and control. This fork is tailored towards local community action.

Key Features

  • Universal Compatibility: Works as a direct replacement for OpenAI and other major AI service APIs (see the example after this list)
  • Local Processing: Run LLMs, generate images, and create audio without sending data to external servers
  • Hardware Friendly: Operates on consumer-grade hardware without requiring a GPU
  • Multiple Model Support: Compatible with various AI model families and architectures
  • Privacy-First: Keep your data secure with 100% on-device processing
  • Free & Open Source: Maintained by the community, led by Smith Clove
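
For example, most OpenAI-compatible clients only need their base URL pointed at the local server; no code changes are required. A minimal sketch, assuming LocalAI is listening on its default port 8080 (exact variable names differ between clients; the ones below are read by recent OpenAI SDKs):

# Redirect an OpenAI-compatible client to the local server instead of api.openai.com
export OPENAI_BASE_URL=http://localhost:8080/v1
# A placeholder key is usually enough unless the server is configured to require one
export OPENAI_API_KEY=sk-local-placeholder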

Use Cases

  • Private AI deployments for enterprises
  • Cost-effective AI development and testing
  • Local AI inference for privacy-sensitive applications
  • On-premises AI solutions for regulated industries

Whether you're looking to reduce AI infrastructure costs, protect sensitive data, or maintain complete control over your AI stack, LocalAI provides the tools you need for local AI inference without compromising on capabilities.

Quickstart

Run the installer script:

curl https://localai.io/install.sh | sh

Or run with Docker:

# CPU only image:
docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-cpu

# Nvidia GPU:
docker run -ti --name local-ai -p 8080:8080 --gpus all localai/localai:latest-gpu-nvidia-cuda-12

# CPU and GPU image (bigger size):
docker run -ti --name local-ai -p 8080:8080 localai/localai:latest

# AIO images (these pre-download a set of models ready for use; see https://localai.io/basics/container/)
docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-aio-cpu
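
Once a container is up, you can check that the API is responding. A minimal sketch, assuming the default port mapping shown above (8080) and the OpenAI-compatible model listing endpoint:

# List the models currently known to the server
curl http://localhost:8080/v1/models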

To load models:

# From the model gallery (see available models with `local-ai models list`, in the WebUI from the model tab, or by visiting https://models.localai.io)
local-ai run llama-3.2-1b-instruct:q4_k_m
# Start LocalAI with the phi-2 model directly from huggingface
local-ai run huggingface://TheBloke/phi-2-GGUF/phi-2.Q8_0.gguf
# Install and run a model from the Ollama OCI registry
local-ai run ollama://gemma:2b
# Run a model from a configuration file
local-ai run https://gist.githubusercontent.com/.../phi-2.yaml
# Install and run a model from a standard OCI registry (e.g., Docker Hub)
local-ai run oci://localai/phi-2:latest
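
After a model is installed, it can be addressed by name through the OpenAI-compatible chat endpoint. A minimal sketch, assuming the gallery model from the first example above and a server on the default port 8080:

# Send a chat completion request to the locally installed model
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama-3.2-1b-instruct:q4_k_m",
    "messages": [{"role": "user", "content": "Hello, how are you?"}]
  }'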

🔥🔥 Hot topics (looking for help):

  • Multimodal with vLLM and Video understanding
  • Realtime API
  • 🔥🔥 Distributed, P2P Global community pools
  • WebUI improvements
  • Backends v2
  • Improving UX v2
  • Moderation endpoint
  • Vulkan
  • Anthropic API
  • 💸💸💸

If you want to help and contribute, reach out!

🔗 Community and integrations

Build and deploy custom containers:

WebUIs:

Model galleries:

Other:

🔗 Resources

📖 🎥 Media, Blogs, Social

❤️ Sponsors

Do you find LocalAI useful?

Support the project by becoming a sponsor.

A huge thank you to our generous sponsors who support this project and cover its CI expenses.

📖 License

LocalAI is a community-driven Italian project created by the brilliant Ettore Di Giacinto.

MIT - Author Ettore Di Giacinto mudler@localai.io

🙇 Acknowledgements

LocalAI couldn't have been built without the help of great software already available from the community. Thank you!

🤗 Contributors

This is a community project; a special thanks to our contributors! 🤗
