Chinese Mixtral mixture-of-experts large language models (Chinese Mixtral MoE LLMs)
Updated Apr 30, 2024 - Python