This is the repository holding code and data for "FrugalML: How to Use ML Prediction APIs More Accurately and Cheaply".
Bringing local LLMs to a Minecraft front-end through commands.
LLM Kit - Python Large Language Model Kit for generating data of your choice
Large Multi-Language Models for News Translation
AccIo - Enterprise LLM: Unifying intelligence at your command!
Python-based WebSocket for CLI LLaVA inference.
Effortlessly create and manage your own AI infrastructure with Radiantloom AI. Privacy, security, and flexibility meet ease-of-use in this innovative open-source platform.
Mamba for Vision, Perception and Action
Detailed code explanation of google LLM gemini
In this workshop, we demonstrate how to choose the right containers and instance types, optimize container parameters, set up appropriate autoscaling policies, and use APIs to get recommendations with Amazon SageMaker.
Simple chat interface for local AI using llama-cpp-python and llama-cpp-agent
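A chat interface like the one above can be sketched with llama-cpp-python's high-level API. This is a minimal illustration, not the repository's actual code; the model path is a placeholder and `format_history` is a hypothetical helper added here for clarity.

```python
def format_history(history):
    """Fold (role, text) turns into chat-completion message dicts."""
    return [{"role": role, "content": text} for role, text in history]

def chat(model_path, history):
    # Deferred import: llama-cpp-python is a heavy native dependency.
    from llama_cpp import Llama
    llm = Llama(model_path=model_path, n_ctx=2048)  # path is a placeholder
    out = llm.create_chat_completion(messages=format_history(history))
    return out["choices"][0]["message"]["content"]
```

In a real chat loop, `history` would accumulate alternating user and assistant turns between calls.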
How to stream LLM responses using AWS API Gateway Websockets and Lambda
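The streaming pattern described above usually boils down to a Lambda handler pushing each generated token back through the API Gateway `@connections` endpoint. The sketch below is an assumption-laden outline, not the guide's code: the token source is a placeholder, and the event fields follow the standard WebSocket request context.

```python
import json

def stream_tokens(tokens, send):
    """Forward each token as its own WebSocket frame, then signal completion."""
    count = 0
    for tok in tokens:
        send(json.dumps({"token": tok}))
        count += 1
    send(json.dumps({"done": True}))
    return count

def handler(event, context):
    """Lambda entry point for a WebSocket route (sketch)."""
    import boto3  # only needed when running inside Lambda
    rc = event["requestContext"]
    api = boto3.client(
        "apigatewaymanagementapi",
        endpoint_url=f"https://{rc['domainName']}/{rc['stage']}",
    )
    send = lambda data: api.post_to_connection(
        ConnectionId=rc["connectionId"], Data=data.encode()
    )
    # Placeholder token source; a real handler would stream from the LLM here.
    stream_tokens(["Hello", ",", " world"], send)
    return {"statusCode": 200}
```

Separating `stream_tokens` from the handler keeps the framing logic testable without AWS credentials.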
HugNLP is a unified and comprehensive NLP library based on HuggingFace Transformers. Hug for NLP now! 😊 HugNLP will be released to @HugAILab.
Inference Llama 2 in one file of pure C
Specify what you want it to build; the AI asks for clarification and then builds it.
Automating the deployment of the Takeoff Server on AWS for LLMs
Creating a workflow to train T5 language models.