ConnectionError after several hours of quantization (network issue) #6
The model should be saved once quantization finishes, so next time you can load it directly instead of starting over. For example, hand-write a load_state_dict.
Thanks for the reply. I'd like to know whether my run above actually reached the model-saving step, and what the save path is? I didn't seem to see it.
This step saves the model after quantization finishes: https://github.com/bytedance/decoupleQ/blob/main/llama.py#L496
Thanks for the patient replies. 2. Are there download links for the three datasets datasets = ['wikitext2', 'ptb', 'c4']? And if they can't be obtained, does it matter? From the code they only seem to be used for evaluation, i.e. the ppl metric?
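The save-then-reload suggestion above can be sketched as follows. This is a minimal stand-in that pickles a plain dict (the checkpoint path and the toy `quantize` function are hypothetical); in the actual repo the quantized weights would instead go through torch.save(model.state_dict(), path) and model.load_state_dict(torch.load(path)).

```python
import os
import pickle

def quantize(weights):
    # Stand-in for the real per-layer quantization (the hours-long part).
    return {name: [round(v) for v in w] for name, w in weights.items()}

def quantize_or_load(weights, ckpt_path="quantized_model.pkl"):
    """Load the saved quantized state if present, else quantize and save it."""
    if os.path.exists(ckpt_path):
        with open(ckpt_path, "rb") as f:
            return pickle.load(f)  # skip re-quantization entirely
    state = quantize(weights)
    with open(ckpt_path, "wb") as f:
        pickle.dump(state, f)  # persist immediately after quantization
    return state
```

On a second run the checkpoint is found on disk and the expensive quantization step is skipped.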
For inference, take a look at run_inference_llama.sh?
Traceback (most recent call last): linear_w2a16.py
Have you solved this? I ran into the same problem:
No; I really can't find this module either. Hoping the maintainer can take a look @GuoYi0
You need to modify build.sh, pointing the various paths at your actual working environment, and then run bash build.sh.
That doesn't sound very user-friendly. As for cp libdecoupleQ_kernels.so: I don't see libdecoupleQ_kernels.so anywhere. Is it generated automatically when the build runs?
Yes, it is generated during the cmake step.
Is Python 3.10 OK for my environment? You are using 3.9 here.
It should be fine, as long as all the paths are set correctly.
CMake Error: The source directory "/media/data/xgp/decoupleQ/csrc/build" does not appear to contain CMakeLists.txt. |
Same here; the reproduction code seems to have some problems.
Could you just open-source decoupleQ_kernels as a Python file? The compilation keeps failing. Or could you kindly provide a fix? We all have trouble reproducing this; as far as I can tell, everyone else only uses your inference path, and nobody has reproduced the build.
@chuangzhidan @hsb1995 |
It's best to edit the corresponding paths directly in build.sh and then run it.
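The CMake error reported above ("does not appear to contain CMakeLists.txt" for a path ending in csrc/build) usually means cmake was pointed at the build directory instead of the source directory that actually holds CMakeLists.txt. A small hedged helper to catch this before invoking cmake (the directory layout shown is an assumption about this repo):

```python
import os

def check_cmake_source(source_dir):
    """Return True if source_dir looks like a valid CMake source tree.

    Guards against the common mistake of pointing cmake at the build/
    directory (which has no CMakeLists.txt) instead of its parent.
    """
    return os.path.isfile(os.path.join(source_dir, "CMakeLists.txt"))

# Typical out-of-source layout (illustrative paths):
#   csrc/CMakeLists.txt   <- the cmake *source* directory
#   csrc/build/           <- run `cmake ..` from inside here
```

In other words, from csrc/build the invocation should reference the parent (`cmake ..`), not the build directory itself.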
time cost for block minimization: 98.45541977882385
quant layer 31 done! time cost 295.21963691711426
The quantization duration is 2.6511569974819817
Traceback (most recent call last):
File "/workspace/decoupleQ/copy_llama.py", line 427, in
dataloader, testloader = get_loaders(
File "/workspace/decoupleQ/datautils.py", line 209, in get_loaders
return get_ptb_new(nsamples, seed, seqlen, model)
File "/workspace/decoupleQ/datautils.py", line 140, in get_ptb_new
traindata = load_dataset('ptb_text_only', 'penn_treebank', split='train')
File "/opt/conda/lib/python3.10/site-packages/datasets/load.py", line 1657, in load_dataset
builder_instance = load_dataset_builder(
File "/opt/conda/lib/python3.10/site-packages/datasets/load.py", line 1494, in load_dataset_builder
dataset_module = dataset_module_factory(
File "/opt/conda/lib/python3.10/site-packages/datasets/load.py", line 1204, in dataset_module_factory
raise e1 from None
File "/opt/conda/lib/python3.10/site-packages/datasets/load.py", line 1148, in dataset_module_factory
).get_module()
File "/opt/conda/lib/python3.10/site-packages/datasets/load.py", line 523, in get_module
local_path = self.download_loading_script(revision)
File "/opt/conda/lib/python3.10/site-packages/datasets/load.py", line 506, in download_loading_script
return cached_path(file_path, download_config=self.download_config)
File "/opt/conda/lib/python3.10/site-packages/datasets/utils/file_utils.py", line 298, in cached_path
output_path = get_from_cache(
File "/opt/conda/lib/python3.10/site-packages/datasets/utils/file_utils.py", line 615, in get_from_cache
raise ConnectionError(f"Couldn't reach {url} ({repr(head_error)})")
ConnectionError: Couldn't reach https://raw.githubusercontent.com/huggingface/datasets/1.17.0/datasets/ptb_text_only/ptb_text_only.py (ReadTimeout(ReadTimeoutError("HTTPSConnectionPool(host='raw.githubusercontent.com', port=443): Read timed out. (read timeout=10)")))
1. I'd like to know how to work around the network problem.
2. Could the quantization and this network-access step be decoupled? Otherwise one small bug anywhere in the pipeline throws away the half day of quantization that ran before it.
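One way to decouple the two stages, sketched under assumptions (the function names and checkpoint path are hypothetical, not from the repo): save the quantized checkpoint first, then run the network-dependent evaluation inside a try/except so a ConnectionError cannot discard the quantization work. Separately, the real HF_DATASETS_OFFLINE=1 environment variable, set before importing datasets, forces cached-only dataset loading if the datasets were downloaded once before.

```python
import pickle

def save_checkpoint(state, path):
    # Persist quantized weights BEFORE any network-dependent step.
    with open(path, "wb") as f:
        pickle.dump(state, f)

def run_pipeline(state, evaluate, ckpt_path="quantized.ckpt"):
    """Checkpoint is written first; evaluation failures are non-fatal."""
    save_checkpoint(state, ckpt_path)
    try:
        return evaluate(state)  # e.g. the ppl eval that calls load_dataset(...)
    except ConnectionError as exc:
        # Checkpoint is already on disk; rerun evaluation later, offline
        # if the datasets are cached:
        #   os.environ["HF_DATASETS_OFFLINE"] = "1"  # real HF datasets env var
        print(f"evaluation failed ({exc}); checkpoint kept at {ckpt_path}")
        return None
```

With this structure, the read-timeout in the traceback above would only abort the evaluation pass, and the quantized model could still be loaded on the next run.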