🏗️ Fine-tune, build, and deploy open-source LLMs easily!
Updated Jun 6, 2024 · Go
NAACL '24 (Demo) / MLSys @ NeurIPS '23 - RedCoast: A Lightweight Tool to Automate Distributed Training and Inference
PyTorch/XLA integration with JetStream (https://github.com/google/JetStream) for LLM inference
🤖 The free, open-source OpenAI alternative. Self-hosted, community-driven, and local-first. A drop-in replacement for OpenAI that runs on consumer-grade hardware; no GPU required. Runs gguf, transformers, diffusers, and many other model architectures. It can generate text, audio, video, and images, and also supports voice cloning.
Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to use any LLM you need. With Xinference, you can run inference with any open-source language model, speech recognition model, or multimodal model, whether in the cloud, on-premises, or even on your laptop.
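A minimal stdlib-only sketch of what "changing a single line" amounts to with an OpenAI-compatible server: the request body and path stay the same, and only the base URL differs. The local port and model name below are illustrative assumptions, not taken from the Xinference docs.

```python
import json
from urllib import request


def chat_request(base_url: str, model: str, prompt: str) -> request.Request:
    """Build an OpenAI-style chat completion request for any compatible server."""
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return request.Request(
        f"{base_url}/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )


# Swapping providers is just a different base_url (the "single line"):
openai_req = chat_request("https://api.openai.com/v1", "gpt-4o", "Hi")
local_req = chat_request("http://localhost:9997/v1", "my-llama", "Hi")  # assumed local endpoint
print(local_req.full_url)  # http://localhost:9997/v1/chat/completions
```

Because the wire format is shared, client libraries that accept a configurable base URL work against either endpoint unchanged.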
Chat with AI large language models running natively in your browser. Enjoy private, server-free, seamless AI conversations.
Documentation for Google's Gen AI site - including the Gemini API and Gemma
JetStream is a throughput and memory optimized engine for LLM inference on XLA devices, starting with TPUs (and GPUs in future -- PRs welcome).
A snappy, keyboard-centric terminal user interface for interacting with large language models. Chat with ChatGPT, Claude, Llama 3, Phi 3, Mistral, Gemma and more.
Fully-featured, beautiful web interface for Ollama LLMs - built with Next.js. Deploy with a single click.
DeveloperGPT is an LLM-powered command-line tool that translates natural language into terminal commands and offers in-terminal chat.
This repository highlights the reasoning capabilities of ✨ Mistral / LLaMA-3 / Phi-3 / Gemma / Flan-T5 / GPT-4o ✨ in targeted sentiment analysis of Russian mass media (translated to English) 📊