
No available targets are compatible with triple "wasm32-unknown-unknown-wasm" #63

Closed
davidar opened this issue Apr 23, 2023 · 5 comments


davidar (Contributor) commented Apr 23, 2023

I'm stuck on the following error:

$ python3 build.py --target webgpu --debug-dump
Load cached module from dist/vicuna-7b-v1/mod_cache_before_build.pkl and skip tracing. You can use --use-cache=0 to retrace
Dump mod to dist/vicuna-7b-v1/debug/mod_before_build.py
Dump mod to dist/vicuna-7b-v1/debug/mod_build_stage.py
Traceback (most recent call last):
  File "/workspaces/web-llm/build.py", line 200, in <module>
    build(mod, ARGS)
  File "/workspaces/web-llm/build.py", line 166, in build
    ex = relax.build(mod_deploy, args.target)
  File "/home/vscode/.local/lib/python3.10/site-packages/tvm/relax/vm_build.py", line 325, in build
    return _vmlink(builder, target, tir_mod, ext_libs, params)
  File "/home/vscode/.local/lib/python3.10/site-packages/tvm/relax/vm_build.py", line 239, in _vmlink
    lib = tvm.build(tir_mod, target=target, runtime=_autodetect_system_lib_req(target))
  File "/home/vscode/.local/lib/python3.10/site-packages/tvm/driver/build_module.py", line 281, in build
    rt_mod_host = _driver_ffi.tir_to_runtime(annotated_mods, target_host)
  File "tvm/_ffi/_cython/./packed_func.pxi", line 331, in tvm._ffi._cy3.core.PackedFuncBase.__call__
  File "tvm/_ffi/_cython/./packed_func.pxi", line 262, in tvm._ffi._cy3.core.FuncCall
  File "tvm/_ffi/_cython/./packed_func.pxi", line 251, in tvm._ffi._cy3.core.FuncCall3
  File "tvm/_ffi/_cython/./base.pxi", line 181, in tvm._ffi._cy3.core.CHECK_CALL
tvm._ffi.base.TVMError: Traceback (most recent call last):
  6: TVMFuncCall
  5: tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::runtime::TypedPackedFunc<tvm::runtime::Module (tvm::runtime::Map<tvm::Target, tvm::IRModule, void, void> const&, tvm::Target)>::AssignTypedLambda<tvm::{lambda(tvm::runtime::Map<tvm::Target, tvm::IRModule, void, void> const&, tvm::Target)#6}>(tvm::{lambda(tvm::runtime::Map<tvm::Target, tvm::IRModule, void, void> const&, tvm::Target)#6}, std::string)::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}> >::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)
  4: tvm::TIRToRuntime(tvm::runtime::Map<tvm::Target, tvm::IRModule, void, void> const&, tvm::Target const&)
  3: tvm::codegen::Build(tvm::IRModule, tvm::Target)
  2: tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::runtime::TypedPackedFunc<tvm::runtime::Module (tvm::IRModule, tvm::Target)>::AssignTypedLambda<tvm::codegen::{lambda(tvm::IRModule, tvm::Target)#1}>(tvm::codegen::{lambda(tvm::IRModule, tvm::Target)#1}, std::string)::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}> >::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)
  1: tvm::codegen::LLVMModuleNode::Init(tvm::IRModule const&, tvm::Target const&)
  0: tvm::codegen::LLVMTargetInfo::GetOrCreateTargetMachine(bool)
  File "/workspace/tvm/src/target/llvm/llvm_instance.cc", line 302
TVMError: 
---------------------------------------------------------------
An error occurred during the execution of TVM.
For more information, please see: https://tvm.apache.org/docs/errors.html
---------------------------------------------------------------
  Check failed: (target_machine_ != nullptr) is false: No available targets are compatible with triple "wasm32-unknown-unknown-wasm"

I found a reference to this error here, but since I'm using the nightly TVM build recommended in the README, I'm not clear what I can do to fix it.

LiliumSancta commented
I got a similar error; I tested on both WSL2 and Ubuntu 22.04.

takfate commented Apr 29, 2023

I have the same problem. It occurs on CentOS.

dan9070 commented Apr 30, 2023

Same issue here, identical error. Really need some help with this.

tqchen (Contributor) commented Apr 30, 2023

This is likely because the LLVM that ships with the prebuilt package does not include the WebAssembly backend. You can try swapping out LLVM and building from source, e.g. from the branch here: https://github.com/mlc-ai/relax

yongwww commented May 10, 2023

I encountered the same issue on my Ubuntu 22.04 instance. Building relax from source resolved it for me; you can refer to the guide at https://tvm.apache.org/docs/install/from_source.html#developers-get-source-from-github. I also followed the steps in the script https://github.com/mlc-ai/relax/blob/mlc/tests/scripts/task_web_wasm.sh to build the web wasm target.

@tqchen tqchen closed this as completed Jun 16, 2023