This is the official PyTorch implementation of "LLMC: Benchmarking Large Language Model Quantization with a Versatile Compression Toolkit".
Running large language models on a single GPU for throughput-oriented scenarios.
Skip lists are a type of data structure that allows fast search, insertion, and deletion operations within an ordered sequence of elements. They are a probabilistic alternative to balanced trees and can be viewed as a linked list with multiple levels of additional pointers.
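The description above can be illustrated with a minimal sketch (not this repository's actual code): each node carries a tower of forward pointers, the tower height is chosen probabilistically, and search descends from the highest level, moving right while the next key is smaller. Class and method names here are illustrative.

```python
import random

class _Node:
    __slots__ = ("key", "forward")
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * level  # one forward pointer per level

class SkipList:
    """Probabilistic ordered set: expected O(log n) search and insert."""
    MAX_LEVEL = 16
    P = 0.5  # probability of growing a node's tower by one level

    def __init__(self):
        self.head = _Node(None, self.MAX_LEVEL)  # sentinel head node
        self.level = 1  # number of levels currently in use

    def _random_level(self):
        lvl = 1
        while random.random() < self.P and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def search(self, key):
        node = self.head
        # Descend level by level, moving right while the next key is smaller.
        for i in range(self.level - 1, -1, -1):
            while node.forward[i] is not None and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node is not None and node.key == key

    def insert(self, key):
        # Record the rightmost node visited at each level (splice points).
        update = [self.head] * self.MAX_LEVEL
        node = self.head
        for i in range(self.level - 1, -1, -1):
            while node.forward[i] is not None and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = _Node(key, lvl)
        for i in range(lvl):  # splice the new node into each of its levels
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new
```

Because the expected tower height is geometric with ratio 1/2, a search skips roughly half of the remaining candidates at each level, matching the expected O(log n) behavior of a balanced tree without any rebalancing.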
A crowdsourced distributed cluster for AI art and text generation
A collection of high-order functions providing a bevy of new declarative features
🔐 Node/browser module for generating TOTP and HOTP codes, based on RFC 6238 and RFC 4226 🗝️
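This is not the module's own code, but the two RFC algorithms it names can be sketched in a few lines: HOTP (RFC 4226) is HMAC-SHA1 over an 8-byte counter followed by dynamic truncation, and TOTP (RFC 6238) is HOTP with the counter derived from the current Unix time.

```python
import hmac
import hashlib
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226: HMAC-SHA1 over the big-endian 8-byte counter, then
    dynamic truncation to a fixed number of decimal digits."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                              # low nibble of last byte
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, period: int = 30, digits: int = 6, t=None) -> str:
    """RFC 6238: HOTP with counter = floor(unix_time / period)."""
    if t is None:
        t = time.time()
    return hotp(secret, int(t // period), digits)
```

With the RFC 4226 test secret `b"12345678901234567890"`, `hotp(secret, 0)` yields `"755224"` and `hotp(secret, 1)` yields `"287082"`, matching the vectors in the specification's appendix.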
Training and inference scripts for Meta's OPT LLM models using the Alpaca Instruct format.
4D reconstruction of developmental trajectories using spherical harmonics
[NeurIPS 2023 - ML for Audio Workshop (Oral)] Zero-shot audio captioning with audio-language model guidance and audio context keywords
Train very large language models in Jax.
Adds up-to-date TweakScale /L patches for Orbit Portal Technologies (OPT).
Data-driven insights on job hunting for international students in the USA, analyzing listings, roles, and trends.
Small benchmark library focused on avoiding optimization/deoptimization pollution between tests by isolating them.
Curated list of open source and openly accessible large language models