
SparseGPT: Massive Language Models Can Be Accurately Pruned in One-Shot #74

Open
ziwang-com opened this issue Jun 5, 2023 · 0 comments

Comments

@ziwang-com
Owner

https://github.com/IST-DASLab/sparsegpt

Code for the ICML 2023 paper "SparseGPT: Massive Language Models Can Be Accurately Pruned in One-Shot".

arxiv.org/abs/2301.00774

Specifically, it provides scripts and implementations to:

Evaluate baseline and pruned models on the raw WikiText-2, PTB and C4 subsets (datautils.py, opt.py, bloom.py); a minimal evaluation sketch follows this list.
Run unstructured, n:m, and sparse + quantized SparseGPT compression on OPT and BLOOM models (sparsegpt.py, opt.py, bloom.py); an illustration of these sparsity patterns appears after the GPTQ note below.
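The following is a minimal sketch of the kind of perplexity evaluation those scripts perform; it is not the repository's own datautils.py/opt.py code. The checkpoint name and sequence length are placeholder assumptions, and the upstream scripts handle data loading and segmentation in their own way.

```python
# Sketch: WikiText-2 perplexity for a dense or pruned causal LM.
# Assumptions: facebook/opt-125m as the checkpoint, seqlen = 2048.
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "facebook/opt-125m"  # placeholder checkpoint
seqlen = 2048

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name).eval()

# Concatenate the raw test split and chop it into fixed-length segments.
test = load_dataset("wikitext", "wikitext-2-raw-v1", split="test")
ids = tokenizer("\n\n".join(test["text"]), return_tensors="pt").input_ids

nlls = []
n_segments = ids.shape[1] // seqlen
for i in range(n_segments):
    batch = ids[:, i * seqlen:(i + 1) * seqlen]
    with torch.no_grad():
        # With labels == inputs, the model returns the mean token NLL.
        loss = model(batch, labels=batch).loss
    nlls.append(loss.float() * seqlen)

ppl = torch.exp(torch.stack(nlls).sum() / (n_segments * seqlen))
print(f"WikiText-2 perplexity: {ppl.item():.2f}")
```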
We note that this SparseGPT implementation is based on our open-source GPTQ code.
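For readers unfamiliar with the sparsity patterns named above, the sketch below builds magnitude-based masks for 50% unstructured sparsity and 2:4 (n:m) sparsity on a single weight matrix. It is purely illustrative: SparseGPT itself selects and updates weights layer by layer using approximate second-order information, not plain magnitudes.

```python
# Sketch: what "unstructured" and "n:m" sparsity patterns mean on one matrix.
import torch

def unstructured_mask(weight: torch.Tensor, sparsity: float) -> torch.Tensor:
    """Keep the largest-magnitude (1 - sparsity) fraction of entries globally."""
    k = max(1, int(weight.numel() * sparsity))
    threshold = weight.abs().flatten().kthvalue(k).values
    return weight.abs() > threshold

def n_m_mask(weight: torch.Tensor, n: int = 2, m: int = 4) -> torch.Tensor:
    """In every group of m consecutive weights along a row, keep the n largest.
    Assumes the number of columns is divisible by m."""
    rows, cols = weight.shape
    groups = weight.abs().reshape(rows, cols // m, m)
    keep = groups.topk(n, dim=-1).indices
    mask = torch.zeros_like(groups, dtype=torch.bool)
    mask.scatter_(-1, keep, True)
    return mask.reshape(rows, cols)

w = torch.randn(8, 16)
w_unstructured = w * unstructured_mask(w, 0.5)   # ~50% zeros, anywhere
w_2to4 = w * n_m_mask(w, 2, 4)                   # exactly 2 kept per group of 4
print((w_unstructured == 0).float().mean(), (w_2to4 == 0).float().mean())
```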
