This is the repository holding code and data for "FrugalML: How to Use ML Prediction APIs More Accurately and Cheaply".
LLM Chain for answering questions from documents with citations
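As a hedged illustration of such a chain, the sketch below uses the classic LangChain API (RetrievalQAWithSourcesChain, FAISS, OpenAIEmbeddings); these import paths have moved in newer LangChain releases, and the two documents are made-up examples.

```python
# A minimal sketch of question answering with source citations,
# following the classic langchain import paths (version-dependent).
from langchain.chains import RetrievalQAWithSourcesChain
from langchain.chat_models import ChatOpenAI
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import FAISS

docs = ["FrugalML reduces prediction-API cost.", "vLLM speeds up LLM serving."]
metadatas = [{"source": "doc1"}, {"source": "doc2"}]  # example sources

store = FAISS.from_texts(docs, OpenAIEmbeddings(), metadatas=metadatas)
chain = RetrievalQAWithSourcesChain.from_chain_type(
    llm=ChatOpenAI(), retriever=store.as_retriever()
)
# Returns a dict with an "answer" and a "sources" field for citations.
print(chain({"question": "What does vLLM do?"}))
```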
HugNLP is a unified and comprehensive NLP library based on HuggingFace Transformers. Happy hugging for NLP! 😊 HugNLP will be released to @HugAILab
Logical verification of probabilistic/language model 'intuitions'.
Seamlessly integrate powerful language models like ChatGPT into scikit-learn for enhanced text analysis tasks.
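A minimal sketch of that integration, based on Scikit-LLM's zero-shot classifier as documented in early releases; the import paths and the openai_model parameter may differ in newer versions, and the key is a placeholder.

```python
# Zero-shot text classification with Scikit-LLM (early-release API).
from skllm import ZeroShotGPTClassifier
from skllm.config import SKLLMConfig

SKLLMConfig.set_openai_key("<YOUR_OPENAI_KEY>")  # placeholder key

X = ["The movie was fantastic!", "Terrible service, never again."]
y = ["positive", "negative"]

clf = ZeroShotGPTClassifier(openai_model="gpt-3.5-turbo")
clf.fit(X, y)           # stores the candidate labels; no gradient training
print(clf.predict(X))   # -> ["positive", "negative"] (model-dependent)
```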
Plug-and-play implementation of Tree of Thoughts: Deliberate Problem Solving with Large Language Models, which elevates model reasoning by at least 70%
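The core idea, independent of that library's actual API, is a beam search over model-generated reasoning steps. The sketch below is conceptual only: propose_thoughts and score_thought are hypothetical stand-ins for LLM calls.

```python
# Conceptual Tree-of-Thoughts search loop (not the library's API).
def tree_of_thoughts(problem, propose_thoughts, score_thought,
                     breadth=5, beam=2, depth=3):
    frontier = [problem]  # partial reasoning paths, best-first
    for _ in range(depth):
        candidates = []
        for state in frontier:
            # Ask the model for several candidate next steps ("thoughts").
            for thought in propose_thoughts(state, k=breadth):
                candidates.append(state + "\n" + thought)
        # Keep only the most promising paths, as judged by the scorer.
        candidates.sort(key=score_thought, reverse=True)
        frontier = candidates[:beam]
    return frontier[0]  # highest-scoring reasoning path
```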
LangChain Open Assistant demo using the Hugging Face Hub (Inference API)
Implementation of the StableLM/Pythia/INCITE language models based on nanoGPT. Supports flash attention, LLaMA-Adapter fine-tuning, pre-training. Apache 2.0-licensed.
🔥 Your private task assistant with GPT 🔥 (1) Ask questions about your documents. (2) Automate tasks.
Specify what you want it to build; the AI asks for clarification and then builds it.
A high-throughput and memory-efficient inference and serving engine for LLMs
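For example, vLLM's offline batch-inference API looks like the sketch below; the facebook/opt-125m checkpoint is just a small example model.

```python
# Offline batched generation with vLLM.
from vllm import LLM, SamplingParams

prompts = ["The capital of France is", "The future of LLM serving is"]
sampling_params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=64)

llm = LLM(model="facebook/opt-125m")              # loads the model once
outputs = llm.generate(prompts, sampling_params)  # high-throughput batch decode

for output in outputs:
    print(output.prompt, "->", output.outputs[0].text)
```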
Collection of helpful scripts for working with GGML models
Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4bit quantization, LoRA and LLaMA-Adapter fine-tuning, pre-training. Apache 2.0-licensed.
This Python app generates a NIST 800-53 control implementation statement for each control and writes the results to a CSV file.
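A hypothetical sketch of that workflow, with generate_implementation standing in for the actual LLM call and a made-up list of control IDs:

```python
# Write one LLM-generated implementation statement per control to CSV.
import csv

controls = ["AC-1", "AC-2", "AU-2"]  # sample NIST 800-53 control IDs

def generate_implementation(control_id):
    # Placeholder for an LLM prompt such as:
    # "Write a control implementation statement for NIST 800-53 {control_id}."
    return f"Implementation details for {control_id} ..."

with open("control_implementations.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["control", "implementation"])
    for cid in controls:
        writer.writerow([cid, generate_implementation(cid)])
```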
A library of data loaders for LLMs made by the community, to be used with GPT Index and/or LangChain
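For instance, GPT Index (now LlamaIndex) exposes download_loader for pulling a community loader at runtime; the SimpleWebPageReader loader below is one example from the hub, and the URL is a placeholder.

```python
# Fetch and use a community data loader via GPT Index / LlamaIndex.
from llama_index import download_loader

SimpleWebPageReader = download_loader("SimpleWebPageReader")
documents = SimpleWebPageReader().load_data(urls=["https://example.com"])
print(len(documents), "documents loaded")
```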
Pandas AI is a Python library that integrates generative artificial intelligence capabilities into Pandas, making dataframes conversational
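A minimal sketch of the conversational-dataframe workflow, following the early PandasAI API (newer releases reorganized it around SmartDataframe); the data and the API token are placeholders.

```python
# Ask natural-language questions about a DataFrame with PandasAI.
import pandas as pd
from pandasai import PandasAI
from pandasai.llm.openai import OpenAI

df = pd.DataFrame({"country": ["US", "UK", "FR"], "gdp": [21.4, 2.8, 2.7]})

llm = OpenAI(api_token="<YOUR_OPENAI_KEY>")  # placeholder token
pandas_ai = PandasAI(llm)
print(pandas_ai.run(df, prompt="Which country has the highest gdp?"))
```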