
Mac M2: local environment fails to start with ggml_metal_init: error: Error #309

Open · 2 tasks done
faceAngus opened this issue Apr 30, 2024 · 2 comments

@faceAngus

Is there an existing issue / discussion for this?

  • I have searched the existing issues / discussions

Is there an existing answer for this in FAQ?

  • I have searched FAQ

Current Behavior

No response

Expected Behavior

No response

Environment

- OS: macOS (Apple M2 Max)
- NVIDIA Driver:
- CUDA:
- docker:
- docker-compose:
- NVIDIA GPU:
- NVIDIA GPU Memory:

QAnything logs

llama_new_context_with_model: n_ctx = 4096
llama_new_context_with_model: n_batch = 512
llama_new_context_with_model: n_ubatch = 512
llama_new_context_with_model: freq_base = 10000.0
llama_new_context_with_model: freq_scale = 1
ggml_metal_init: allocating
ggml_metal_init: found device: Apple M2 Max
ggml_metal_init: picking default device: Apple M2 Max
ggml_metal_init: using embedded metal library
ggml_metal_init: error: Error Domain=MTLLibraryErrorDomain Code=3 "program_source:155:11: error: unions are not supported in Metal
union {
^
program_source:176:11: error: unions are not supported in Metal
union {
^
program_source:197:11: error: unions are not supported in Metal
union {
^
program_source:219:11: error: unions are not supported in Metal
union {
^
program_source:264:11: error: unions are not supported in Metal
union {
^
program_source:291:11: error: unions are not supported in Metal
union {
^
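
The failure above is the Metal shader library refusing to compile: the Metal compiler in use rejects the C-style unions in ggml's embedded Metal source, something generally reported on older macOS / Metal toolchains. A hedged workaround sketch, assuming QAnything forwards its constructor arguments to llama-cpp-python, is to disable GPU offload so ggml_metal_init is never called (the model path below is a placeholder, not from this issue):

# Hedged sketch: keep llama.cpp inference on the CPU so the Metal backend
# is never initialized. n_gpu_layers is a real llama-cpp-python parameter;
# QAnything constructs Llama() in llm_for_llamacpp.py (see tracebacks below).
from llama_cpp import Llama

llm = Llama(
    model_path="path/to/model.gguf",  # placeholder; use your local GGUF model
    n_ctx=4096,                       # matches n_ctx in the log above
    n_gpu_layers=0,                   # 0 = no Metal/GPU offload
)

This trades GPU acceleration for a working startup; upgrading macOS (and with it the Metal compiler) or llama-cpp-python may be the longer-term fix.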

Steps To Reproduce

bash scripts/run_for_3B_in_M1_mac.sh
It fails to start.

Anything else?

No response

@faceAngus (Author)

llama_new_context_with_model: failed to initialize Metal backend
[2024-05-01 08:06:44 +0800] [2194] [ERROR] Experienced exception while trying to serve
Traceback (most recent call last):
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/mixins/startup.py", line 958, in serve_single
worker_serve(monitor_publisher=None, **kwargs)
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/worker/serve.py", line 143, in worker_serve
raise e
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/worker/serve.py", line 117, in worker_serve
return _serve_http_1(
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/server/runners.py", line 223, in _serve_http_1
loop.run_until_complete(app._server_event("init", "before"))
File "uvloop/loop.pyx", line 1517, in uvloop.loop.Loop.run_until_complete
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/app.py", line 1764, in _server_event
await self.dispatch(
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/signals.py", line 208, in dispatch
return await dispatch
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/signals.py", line 183, in _dispatch
raise e
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/signals.py", line 167, in _dispatch
retval = await maybe_coroutine
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/app.py", line 1315, in _listener
await maybe_coro
File "/Users/angus/Desktop/workspase/pyProject/QAnything/qanything_kernel/qanything_server/sanic_api.py", line 177, in init_local_doc_qa
local_doc_qa.init_cfg(args=args)
File "/Users/angus/Desktop/workspase/pyProject/QAnything/qanything_kernel/core/local_doc_qa.py", line 71, in init_cfg
self.llm: LlamaCPPCustomLLM = LlamaCPPCustomLLM(args)
File "/Users/angus/Desktop/workspase/pyProject/QAnything/qanything_kernel/connector/llm/llm_for_llamacpp.py", line 25, in init
self.llm = Llama(
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/llama_cpp/llama.py", line 337, in init
self._ctx = _LlamaContext(
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/llama_cpp/_internals.py", line 265, in init
raise ValueError("Failed to create llama_context")
ValueError: Failed to create llama_context
[2024-05-01 08:06:44 +0800] [2194] [INFO] Server Stopped
Traceback (most recent call last):
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/runpy.py", line 196, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/runpy.py", line 86, in _run_code
exec(code, run_globals)
File "/Users/angus/Desktop/workspase/pyProject/QAnything/qanything_kernel/qanything_server/sanic_api.py", line 210, in
app.run(host=args.host, port=args.port, single_process=True, access_log=False)
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/mixins/startup.py", line 215, in run
serve(primary=self) # type: ignore
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/mixins/startup.py", line 958, in serve_single
worker_serve(monitor_publisher=None, **kwargs)
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/worker/serve.py", line 143, in worker_serve
raise e
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/worker/serve.py", line 117, in worker_serve
return _serve_http_1(
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/server/runners.py", line 223, in _serve_http_1
loop.run_until_complete(app._server_event("init", "before"))
File "uvloop/loop.pyx", line 1517, in uvloop.loop.Loop.run_until_complete
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/app.py", line 1764, in _server_event
await self.dispatch(
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/signals.py", line 208, in dispatch
return await dispatch
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/signals.py", line 183, in _dispatch
raise e
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/signals.py", line 167, in _dispatch
retval = await maybe_coroutine
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/app.py", line 1315, in _listener
await maybe_coro
File "/Users/angus/Desktop/workspase/pyProject/QAnything/qanything_kernel/qanything_server/sanic_api.py", line 177, in init_local_doc_qa
local_doc_qa.init_cfg(args=args)
File "/Users/angus/Desktop/workspase/pyProject/QAnything/qanything_kernel/core/local_doc_qa.py", line 71, in init_cfg
self.llm: LlamaCPPCustomLLM = LlamaCPPCustomLLM(args)
File "/Users/angus/Desktop/workspase/pyProject/QAnything/qanything_kernel/connector/llm/llm_for_llamacpp.py", line 25, in init
self.llm = Llama(
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/llama_cpp/llama.py", line 337, in init
self._ctx = _LlamaContext(
File "/Users/angus/anaconda3/envs/qanything-python/lib/python3.10/site-packages/llama_cpp/_internals.py", line 265, in init
raise ValueError("Failed to create llama_context")
ValueError: Failed to create llama_context
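
Both tracebacks end in the same ValueError("Failed to create llama_context") raised inside llama-cpp-python's _LlamaContext, so the failure can likely be reproduced without QAnything's Sanic server at all. A minimal, hedged repro sketch to isolate which layer is at fault (the model path is a placeholder):

# Standalone check: does llama-cpp-python alone hit the same Metal error?
# Point model_path at the GGUF file QAnything uses (placeholder below).
import llama_cpp
from llama_cpp import Llama

print("llama-cpp-python version:", llama_cpp.__version__)

llm = Llama(model_path="path/to/model.gguf", n_ctx=4096)
print(llm.create_completion("Hello", max_tokens=8))

If this fails with the same "unions are not supported in Metal" output, the problem sits in the llama-cpp-python / macOS Metal toolchain rather than in QAnything itself.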

@Gavince commented May 4, 2024

ValueError: Failed to create llama_context
