# 64k
Here are 2 public repositories matching this topic.
- Chinese Mixtral mixture-of-experts large language models (Chinese Mixtral MoE LLMs). Updated Apr 30, 2024 · Python