digoal
2024-03-23
PostgreSQL , PolarDB , DuckDB , Grok-1 , llama , Mac Studio
1、Get a Mac Studio M2 Ultra in the top configuration: 192GB unified memory, 8TB SSD, 24-core CPU, 76-core GPU, 32-core Neural Engine. An official refurbished unit sells for roughly 50,000 CNY.
《Good news: Mac Studio can run Grok-1, the open-source large language model released by Musk's xAI》
《Want to reduce SSD writes and extend its lifespan? Disable the macOS swap partition and enable memory compression - tested and effective》
2、Download the grok-1 code
git clone --depth 1 https://github.com/xai-org/grok-1.git
3、Download the model weights; they take up 318GB of disk space.
Three download methods are given below. In mainland China, the magnet link can be downloaded with Xunlei (Thunder).
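Before picking a method, it helps to estimate how long 318GB will take to transfer. A quick back-of-the-envelope calculation (the bandwidth figures below are hypothetical examples, not measurements):

```python
# Rough download-time estimate for the ~318GB Grok-1 checkpoint.
SIZE_GB = 318

def hours_at(mbps: float) -> float:
    """Hours to download SIZE_GB at the given rate in megabits per second."""
    megabits = SIZE_GB * 1024 * 8   # GB -> MB -> megabits
    return megabits / mbps / 3600

for mbps in (100, 500, 1000):
    print(f"{mbps:>5} Mbps -> {hours_at(mbps):5.1f} h")
```

At 100 Mbps the download takes around 7 hours, so plan for it to run overnight unless you have gigabit connectivity.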
3.1、BitTorrent file
Torrent: https://academictorrents.com/details/5f96d43576e3d386c9ba65b883210a393b68210e
3.2、Magnet link; copy the following into Xunlei to download.
Magnet Link:
magnet:?xt=urn:btih:5f96d43576e3d386c9ba65b883210a393b68210e&tr=https%3A%2F%2Facademictorrents.com%2Fannounce.php&tr=udp%3A%2F%2Ftracker.coppersurfer.tk%3A6969&tr=udp%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce
3.3、Download ckpt-0 from Hugging Face
Hugging Face: https://huggingface.co/xai-org/grok-1 / https://huggingface.co/xai-org/grok-1/tree/main/ckpt-0
First, install huggingface-cli:
https://github.com/huggingface/huggingface_hub/blob/main/docs/source/en/guides/cli.md
pip install -U "huggingface_hub[cli]"
huggingface-cli --help
usage: huggingface-cli <command> [<args>]

positional arguments:
  {env,login,whoami,logout,repo,upload,download,lfs-enable-largefiles,lfs-multipart-upload,scan-cache,delete-cache}
                        huggingface-cli command helpers
    env                 Print information about the environment.
    login               Log in using a token from huggingface.co/settings/tokens
    whoami              Find out which huggingface.co account you are logged in as.
    logout              Log out
    repo                {create} Commands to interact with your huggingface.co repos.
    upload              Upload a file or a folder to a repo on the Hub
    download            Download files from the Hub
    lfs-enable-largefiles
                        Configure your repository to enable upload of files > 5GB.
    scan-cache          Scan cache directory.
    delete-cache        Delete revisions from the cache directory.

options:
  -h, --help            show this help message and exit
Use huggingface-cli to download ckpt-0 from Hugging Face:
huggingface-cli download xai-org/grok-1 --repo-type model --include ckpt-0/* --local-dir checkpoints --local-dir-use-symlinks False
If you get errors on zsh, quote 'ckpt-0/*' to prevent the shell from expanding * as a wildcard:
huggingface-cli download xai-org/grok-1 --repo-type model --include 'ckpt-0/*' --local-dir checkpoints --local-dir-use-symlinks False
4、Move the downloaded ckpt-0 directory into the checkpoints directory of the grok-1 code.
mv ckpt-0 grok-1/checkpoints/
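Given the size of the download, it is worth sanity-checking that the weights are complete and in the place run.py expects before starting a run. A minimal sketch (the path comes from the steps above; the 300GB threshold is an assumption, slightly under the stated 318GB total):

```python
import os

def check_ckpt(path: str = "grok-1/checkpoints/ckpt-0", min_gb: float = 300) -> bool:
    """Return True if the checkpoint directory exists and is roughly the expected size."""
    if not os.path.isdir(path):
        return False
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            total += os.path.getsize(os.path.join(root, name))
    return total >= min_gb * 1024**3  # the full checkpoint is ~318GB

if __name__ == "__main__":
    print("checkpoint OK" if check_ckpt() else "checkpoint missing or incomplete")
```

An interrupted BitTorrent or Hugging Face download will fail the size check, which is much cheaper to discover here than after run.py has spent minutes loading partial weights.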
5、Test grok-1
cd grok-1
pip install -r requirements.txt
python run.py
6、llama.cpp has added support for Grok-1; if you are interested, read on: ggerganov/llama.cpp#6204
https://huggingface.co/xai-org/grok-1
Running Llama 2 7B on a MacBook Pro M1 16GB (a general method for Apple Silicon)
Run the LLaMA 2 model locally on a Mac with just 6 lines of code