
internlm

Here are 15 public repositories matching this topic...

This is the official implementation of "LLM-QBench: A Benchmark Towards the Best Practice for Post-training Quantization of Large Language Models". It is also an efficient LLM compression tool that provides various advanced compression methods and supports multiple inference backends.

  • Updated Jun 7, 2024
  • Python
