🐛 Describe the bug
Following the instructions at https://github.com/pytorch/executorch/blob/main/examples/models/llama/README.md#step-3-run-on-your-computer-to-validate, I run:
cmake --preset llm -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=cmake-out
cmake --build cmake-out -j16 --target install --config Release
# Install llama runner
cmake -DCMAKE_INSTALL_PREFIX=cmake-out \
-DBUILD_TESTING=OFF \
-DCMAKE_BUILD_TYPE=Release \
-Bcmake-out/examples/models/llama \
examples/models/llama
cmake --build cmake-out/examples/models/llama -j16 --config Release
This gives the following error:
-- Could NOT find tokenizers (missing: tokenizers_DIR)
flatccrt library is not found.
If needed rebuild with the proper options in CMakeLists.txt
etdump library is not found.
If needed rebuild with the proper options in CMakeLists.txt
bundled_program library is not found.
If needed rebuild with the proper options in CMakeLists.txt
extension_data_loader library is not found.
If needed rebuild with the proper options in CMakeLists.txt
extension_flat_tensor library is not found.
If needed rebuild with the proper options in CMakeLists.txt
coreml_util library is not found.
If needed rebuild with the proper options in CMakeLists.txt
coreml_inmemoryfs library is not found.
If needed rebuild with the proper options in CMakeLists.txt
coremldelegate library is not found.
If needed rebuild with the proper options in CMakeLists.txt
mpsdelegate library is not found.
If needed rebuild with the proper options in CMakeLists.txt
neuron_backend library is not found.
If needed rebuild with the proper options in CMakeLists.txt
qnn_executorch_backend library is not found.
If needed rebuild with the proper options in CMakeLists.txt
custom_ops library is not found.
If needed rebuild with the proper options in CMakeLists.txt
extension_module library is not found.
If needed rebuild with the proper options in CMakeLists.txt
extension_module_static library is not found.
If needed rebuild with the proper options in CMakeLists.txt
extension_tensor library is not found.
If needed rebuild with the proper options in CMakeLists.txt
extension_training library is not found.
If needed rebuild with the proper options in CMakeLists.txt
xnnpack_backend library is not found.
If needed rebuild with the proper options in CMakeLists.txt
vulkan_backend library is not found.
If needed rebuild with the proper options in CMakeLists.txt
optimized_kernels library is not found.
If needed rebuild with the proper options in CMakeLists.txt
optimized_portable_kernels library is not found.
If needed rebuild with the proper options in CMakeLists.txt
cpublas library is not found.
If needed rebuild with the proper options in CMakeLists.txt
eigen_blas library is not found.
If needed rebuild with the proper options in CMakeLists.txt
optimized_ops_lib library is not found.
If needed rebuild with the proper options in CMakeLists.txt
optimized_native_cpu_ops_lib library is not found.
If needed rebuild with the proper options in CMakeLists.txt
quantized_kernels library is not found.
If needed rebuild with the proper options in CMakeLists.txt
quantized_ops_lib library is not found.
If needed rebuild with the proper options in CMakeLists.txt
quantized_ops_aot_lib library is not found.
If needed rebuild with the proper options in CMakeLists.txt
torchao_ops_executorch library is not found.
If needed rebuild with the proper options in CMakeLists.txt
torchao_kernels_aarch64 library is not found.
If needed rebuild with the proper options in CMakeLists.txt
CMake Error at runner/CMakeLists.txt:43 (message):
ExecuTorch must be installed with EXECUTORCH_BUILD_EXTENSION_LLM_RUNNER
enabled.
-- Configuring incomplete, errors occurred!
Traceback (most recent call last):
  File "/home/ec2-user/actions-runner/_work/executorch/executorch/test-infra/.github/scripts/run_with_env_secrets.py", line 102, in <module>
    main()
  File "/home/ec2-user/actions-runner/_work/executorch/executorch/test-infra/.github/scripts/run_with_env_secrets.py", line 98, in main
    run_cmd_or_die(f"docker exec -t {container_name} /exec")
  File "/home/ec2-user/actions-runner/_work/executorch/executorch/test-infra/.github/scripts/run_with_env_secrets.py", line 39, in run_cmd_or_die
    raise RuntimeError(f"Command {cmd} failed with exit code {exit_code}")
RuntimeError: Command docker exec -t f9030e45234351e8ebab7800d46d733535e6ee9570407b761dd2d3902bf775b5 /exec failed with exit code 1
Error: Process completed with exit code 1.
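As a diagnostic sketch, assuming the build directory is cmake-out as in the commands above, the cache written by the first configure step shows which extension options the llm preset actually set:
# Diagnostic only: list the EXECUTORCH_BUILD_* cache entries after configuring.
# EXECUTORCH_BUILD_EXTENSION_LLM_RUNNER is the option named in the CMake error.
grep EXECUTORCH_BUILD_ cmake-out/CMakeCache.txt | sort
If EXECUTORCH_BUILD_EXTENSION_LLM_RUNNER is not ON in that cache, the preset did not enable it, which would match the error from runner/CMakeLists.txt.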
It works if I don't use the preset and instead do:
cmake -DPYTHON_EXECUTABLE=python \
-DCMAKE_INSTALL_PREFIX=cmake-out \
-DEXECUTORCH_ENABLE_LOGGING=1 \
-DCMAKE_BUILD_TYPE=Release \
-DEXECUTORCH_BUILD_EXTENSION_DATA_LOADER=ON \
-DEXECUTORCH_BUILD_EXTENSION_FLAT_TENSOR=ON \
-DEXECUTORCH_BUILD_EXTENSION_MODULE=ON \
-DEXECUTORCH_BUILD_EXTENSION_TENSOR=ON \
-DEXECUTORCH_BUILD_XNNPACK=ON \
-DEXECUTORCH_BUILD_KERNELS_QUANTIZED=ON \
-DEXECUTORCH_BUILD_KERNELS_OPTIMIZED=ON \
-DEXECUTORCH_BUILD_EXTENSION_LLM_RUNNER=ON \
-DEXECUTORCH_BUILD_EXTENSION_LLM=ON \
-DEXECUTORCH_BUILD_KERNELS_LLM=ON \
-Bcmake-out .
cmake --build cmake-out -j16 --config Release --target install
cmake -DPYTHON_EXECUTABLE=python \
-DCMAKE_BUILD_TYPE=Release \
-Bcmake-out/examples/models/llama \
examples/models/llama
cmake --build cmake-out/examples/models/llama -j16 --config Release
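As a possible workaround sketch (untested): since -D cache entries on the command line take precedence over a preset's cache variables, the missing options could in principle be layered on top of the llm preset, assuming the preset does not already pin them:
# Workaround sketch (untested): add the missing options on top of the preset;
# command-line -D entries override the preset's cache variables.
cmake --preset llm \
  -DCMAKE_BUILD_TYPE=Release \
  -DCMAKE_INSTALL_PREFIX=cmake-out \
  -DEXECUTORCH_BUILD_EXTENSION_LLM_RUNNER=ON \
  -DEXECUTORCH_BUILD_EXTENSION_LLM=ON
cmake --build cmake-out -j16 --target install --config Release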
Versions
CI job from #14074