A LLaMA2-7b chatbot with memory that runs on CPU, optimized using smooth quantization, 4-bit quantization, or Intel® Extension for PyTorch with bfloat16.
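As a rough sketch of the last of those options, bfloat16 inference through Intel® Extension for PyTorch on CPU can look like the code below; the checkpoint name, prompt, and generation settings are illustrative assumptions, not this repository's actual configuration.

import torch
import intel_extension_for_pytorch as ipex
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed checkpoint; the project's exact model and settings may differ.
model_id = "meta-llama/Llama-2-7b-chat-hf"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16).eval()

# Apply IPEX kernel/graph optimizations for bfloat16 inference on CPU.
model = ipex.optimize(model, dtype=torch.bfloat16)

prompt = "Summarize what 4-bit quantization does."
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad(), torch.cpu.amp.autocast(dtype=torch.bfloat16):
    output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))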
Desktop search tool that brings natural language to traditional file search.
📦 Monorepo of Firefox sidebar extensions.
Multimodal vector search of images and videos taken from trail cameras. Demonstrates how to build multimodal AI search (text and image) using the Meta AI ImageBind model.
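For a rough idea of how such a text-to-image search can be wired up with ImageBind, the sketch below embeds a text query and a set of trail-camera images into the shared embedding space and ranks images by cosine similarity; the file paths and query are placeholders, not the project's actual pipeline.

import torch
from imagebind import data
from imagebind.models import imagebind_model
from imagebind.models.imagebind_model import ModalityType

device = "cpu"
model = imagebind_model.imagebind_huge(pretrained=True).eval().to(device)

image_paths = ["trailcam/clip_001.jpg", "trailcam/clip_002.jpg"]  # placeholder images
query = ["a bear crossing a stream at night"]                     # placeholder query

with torch.no_grad():
    embeddings = model({
        ModalityType.VISION: data.load_and_transform_vision_data(image_paths, device),
        ModalityType.TEXT: data.load_and_transform_text(query, device),
    })

# Cosine similarity between the text query and each image embedding.
scores = torch.nn.functional.cosine_similarity(
    embeddings[ModalityType.TEXT], embeddings[ModalityType.VISION]
)
print(image_paths[int(scores.argmax())], scores.tolist())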
"A private, local OCR solution using Meta's Llama 3.2 Vision model with a Streamlit interface. Processes images entirely offline, supporting formats like JPEG, PNG, and BMP.
A test bot built on the Llama-2 LLM
Free, offline automatic speech recognition
Tailored Music For Your Videos
llama.cpp 🦙 LLM inference in TypeScript
Emma is a conversational bot that uses the Llama3.2-vision AI model, lightly modified to meet the needs of its creator, "kaduu21", but it also serves as a template for anyone who wants to use it
A feature-rich AI chat interface inspired by Meta's AI assistant, built with Next.js and cutting-edge technologies. Real-time weather, news, AI image generation, image search, and lots more 🔥🤩🤖
Real-time object detection system using Meta Ray-Ban AI Glasses and YOLOv8 Nano, processing an Instagram livestream for visual analysis
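A stripped-down version of that detection loop might look like the following, assuming frames arrive through OpenCV from some stream source; the stream URL and weights file are placeholders, and the Ray-Ban/Instagram plumbing is not shown.

import cv2
from ultralytics import YOLO

model = YOLO("yolov8n.pt")            # YOLOv8 Nano weights
cap = cv2.VideoCapture("stream_url")  # placeholder stream source

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = model(frame)            # run detection on the current frame
    annotated = results[0].plot()     # draw boxes and labels
    cv2.imshow("detections", annotated)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()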
A conversational AI Chatbot built with NextJS 14 and Meta AI
Learn to build AI workflows and agents with Meta's Llama Stack
A terminal inference app that leverages Llama 3.2 for accelerated computing.