Issue with building and running sherpa-onnx gpu on Windows #878

Closed
tn-17 opened this issue May 15, 2024 · 6 comments


tn-17 commented May 15, 2024

I am unsure whether this is an issue with the sherpa-onnx GPU installation or the onnxruntime-gpu installation.

I am using Windows 11 with Python 3.10.11.

I have CUDA 12.4, cuDNN 8.9.2.26, and zlib 1.3.1 installed and added to the PATH.
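
As a sanity check, the following sketch tries to load the CUDA/cuDNN/zlib runtime DLLs directly with ctypes; the DLL names are my assumptions for CUDA 12.x / cuDNN 8.x and may differ slightly between releases.

# Sanity-check sketch: can the CUDA 12.x / cuDNN 8.x runtime DLLs be loaded
# from this environment?  DLL names are assumptions and may differ per release.
import ctypes
import os

for name in ("cudart64_12.dll", "cublas64_12.dll", "cublasLt64_12.dll",
             "cudnn64_8.dll", "zlibwapi.dll"):
    try:
        ctypes.WinDLL(name)
        print("OK     ", name)
    except OSError as err:
        print("MISSING", name, "-", err)

# List the PATH entries that are expected to provide those DLLs.
for entry in os.environ.get("PATH", "").split(os.pathsep):
    if any(key in entry.lower() for key in ("cuda", "cudnn", "zlib")):
        print("PATH:", entry)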

I followed the requirement guidelines from: https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements

"ONNX Runtime built with CUDA 11.8 should be compatible with any CUDA 11.x version; ONNX Runtime built with CUDA 12.2 should be compatible with any CUDA 12.x version.

ONNX Runtime built with cuDNN 8.x are not compatible with cuDNN 9.x."

I installed onnxruntime-gpu specifically for CUDA 12.x following the instructions from https://onnxruntime.ai/docs/install
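
To rule out the standalone wheel, a minimal check (sketch) of whether the onnxruntime-gpu install exposes the CUDA execution provider is:

# Minimal check (sketch): does the installed onnxruntime-gpu wheel expose the
# CUDA execution provider in this venv?
import onnxruntime as ort

print("onnxruntime version:", ort.__version__)
print("available providers:", ort.get_available_providers())
# 'CUDAExecutionProvider' should appear in the list; if it does not, the
# CUDA 12.x wheel or its CUDA/cuDNN dependencies are not set up correctly.

Note that this only verifies the separately installed wheel; the sherpa-onnx build downloads its own onnxruntime 1.17.1 GPU binaries, as shown in the build log below.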

I ran python setup.py install from the sherpa-onnx repo directory, following the Method 2 (NVIDIA GPU / CUDA) install instructions at https://k2-fsa.github.io/sherpa/onnx/python/install.html

  • The installation was run within a Python venv.

The installation succeeds, but running the offline-tts-play.py example (https://github.com/k2-fsa/sherpa-onnx/blob/master/python-api-examples/offline-tts-play.py) with --provider cuda results in the following error.

Logs from running the offline-tts-play.py example:

python piper_stream_example.py --vits-model=./en_US-libritts_r-medium.onnx --vits-tokens=./tokens.txt --vits-data-dir=./espeak-ng-data --output-filename=./test.wav --provider cuda --debug True 'This is a test'
Namespace(vits_model='./en_US-libritts_r-medium.onnx', vits_lexicon='', vits_tokens='./tokens.txt', vits_data_dir='./espeak-ng-data', vits_dict_dir='', tts_rule_fsts='', output_filename='./test.wav', sid=0, debug=True, provider='cuda', num_threads=1, speed=1.0, text='This is a test')
2024-05-14 23:13:21,141 INFO [piper_stream_example.py:320] Loading model ...
Traceback (most recent call last):
  File "C:\Users\T\Desktop\Code\ai\stella\piper_stream_example.py", line 383, in <module>
    main()
  File "C:\Users\T\Desktop\Code\ai\stella\piper_stream_example.py", line 321, in main
    tts = sherpa_onnx.OfflineTts(tts_config)
RuntimeError: D:\a\_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1209 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "C:\Users\T\Desktop\Code\ai\stella\venv\lib\site-packages\sherpa_onnx-1.9.24-py3.10-win-amd64.egg\onnxruntime_providers_cuda.dll"
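
LoadLibrary error 126 ("The specified module could not be found") usually means a dependency of onnxruntime_providers_cuda.dll could not be resolved, not that the provider DLL itself is missing. A sketch to reproduce the failure outside sherpa-onnx, using the path from the error above:

# Sketch: try to load the failing provider DLL directly from the same venv.
import ctypes

dll_path = (r"C:\Users\T\Desktop\Code\ai\stella\venv\lib\site-packages"
            r"\sherpa_onnx-1.9.24-py3.10-win-amd64.egg"
            r"\onnxruntime_providers_cuda.dll")
try:
    ctypes.WinDLL(dll_path)
    print("loaded OK")
except OSError as err:
    # Error 126 here typically points at a missing CUDA/cuDNN dependency
    # (e.g. a cuBLAS DLL or cudnn64_8.dll) rather than at the provider DLL itself.
    print("failed:", err)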

Some errors that appear in the build logs are:
-- Failed to find all ICU components (missing: ICU_INCLUDE_DIR ICU_LIBRARY _ICU_REQUIRED_LIBS_FOUND)
-- Could NOT find ZLIB (missing: ZLIB_INCLUDE_DIR)
-- Could NOT find ASIOSDK (missing: ASIOSDK_ROOT_DIR ASIOSDK_INCLUDE_DIR)

Logs from python setup.py install:

$ python setup.py install
running install
C:\Users\T\Desktop\Code\ai\stella\venv\lib\site-packages\setuptools\command\install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
  warnings.warn(
C:\Users\T\Desktop\Code\ai\stella\venv\lib\site-packages\setuptools\command\easy_install.py:144: EasyInstallDeprecationWarning: easy_install command is deprecated. Use build and pip and other standards-based tools.
  warnings.warn(
running bdist_egg
running egg_info
writing sherpa_onnx.egg-info\PKG-INFO
writing dependency_links to sherpa_onnx.egg-info\dependency_links.txt
writing entry points to sherpa_onnx.egg-info\entry_points.txt
writing top-level names to sherpa_onnx.egg-info\top_level.txt
reading manifest file 'sherpa_onnx.egg-info\SOURCES.txt'
reading manifest template 'MANIFEST.in'
no previously-included directories found matching 'android'
no previously-included directories found matching 'ios-swift'
no previously-included directories found matching 'ios-swiftui'
adding license file 'LICENSE'
writing manifest file 'sherpa_onnx.egg-info\SOURCES.txt'
installing library code to build\bdist.win-amd64\egg
running install_lib
running build_py
copying sherpa-onnx\python\sherpa_onnx\__init__.py -> build\lib.win-amd64-cpython-310\sherpa_onnx
running build_ext
Setting PYTHON_EXECUTABLE to C:\Users\T\Desktop\Code\ai\stella\venv\Scripts\python.exe
build command is:

         cmake -DSHERPA_ONNX_ENABLE_GPU=ON -DPYTHON_EXECUTABLE=C:\Users\T\Desktop\Code\ai\stella\venv\Scripts\python.exe -DCMAKE_INSTALL_PREFIX=C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx  -DBUILD_SHARED_LIBS=ON  -DBUILD_PIPER_PHONMIZE_EXE=OFF  -DBUILD_PIPER_PHONMIZE_TESTS=OFF  -DBUILD_ESPEAK_NG_EXE=OFF  -DBUILD_ESPEAK_NG_TESTS=OFF  -DSHERPA_ONNX_ENABLE_CHECK=OFF  -DSHERPA_ONNX_ENABLE_PYTHON=ON  -DSHERPA_ONNX_ENABLE_PORTAUDIO=ON  -DSHERPA_ONNX_ENABLE_WEBSOCKET=ON  -B build\temp.win-amd64-cpython-310\Release -S C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx
         cmake --build build\temp.win-amd64-cpython-310\Release --target install --config Release -- -m

-- Selecting Windows SDK version 10.0.22621.0 to target Windows 10.0.22631.
CMake Warning at CMakeLists.txt:71 (message):
  Compiling for NVIDIA GPU is enabled.  Please make sure cudatoolkit

  is installed on your system.  Otherwise, you will get errors at runtime.

  Hint: You don't need sudo permission to install CUDA toolkit.  Please refer
  to

    https://k2-fsa.github.io/k2/installation/cuda-cudnn.html

  to install CUDA toolkit if you have not installed it.


-- CMAKE_BUILD_TYPE: Release
-- CMAKE_INSTALL_PREFIX: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx
-- BUILD_SHARED_LIBS ON
-- SHERPA_ONNX_ENABLE_PYTHON ON
-- SHERPA_ONNX_ENABLE_TESTS OFF
-- SHERPA_ONNX_ENABLE_CHECK OFF
-- SHERPA_ONNX_ENABLE_PORTAUDIO ON
-- SHERPA_ONNX_ENABLE_JNI OFF
-- SHERPA_ONNX_ENABLE_C_API ON
-- SHERPA_ONNX_ENABLE_WEBSOCKET ON
-- SHERPA_ONNX_ENABLE_GPU ON
-- SHERPA_ONNX_ENABLE_WASM OFF
-- SHERPA_ONNX_ENABLE_WASM_TTS OFF
-- SHERPA_ONNX_ENABLE_WASM_ASR OFF
-- SHERPA_ONNX_ENABLE_WASM_KWS OFF
-- SHERPA_ONNX_ENABLE_WASM_NODEJS OFF
-- SHERPA_ONNX_ENABLE_BINARY ON
-- SHERPA_ONNX_ENABLE_TTS ON
-- SHERPA_ONNX_LINK_LIBSTDCPP_STATICALLY ON
-- SHERPA_ONNX_USE_PRE_INSTALLED_ONNXRUNTIME_IF_AVAILABLE ON
-- TTS is enabled
-- C++ Standard version: 14
-- Disabled warnings: /wd4244;/wd4267;/wd4305;/wd4334;/wd4800;/wd4996
-- Downloading kaldi-native-fbank from https://github.com/csukuangfj/kaldi-native-fbank/archive/refs/tags/v1.19.1.tar.gz
-- kaldi-native-fbank is downloaded to C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/kaldi_native_fbank-src
-- kaldi-native-fbank's binary dir is C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/kaldi_native_fbank-build       
CMake Deprecation Warning at build/temp.win-amd64-cpython-310/Release/_deps/kaldi_native_fbank-src/CMakeLists.txt:24 (cmake_minimum_required):
  Compatibility with CMake < 3.5 will be removed from a future version of
  CMake.

  Update the VERSION argument <min> value or use a ...<max> suffix to tell
  CMake that the project does not need compatibility with older versions.


-- CMAKE_BUILD_TYPE: Release
-- CMAKE_EXPORT_COMPILE_COMMANDS:
-- BUILD_SHARED_LIBS: ON
-- KALDI_NATIVE_FBANK_BUILD_TESTS: OFF
-- KALDI_NATIVE_FBANK_BUILD_PYTHON: OFF
-- KALDI_NATIVE_FBANK_ENABLE_CHECK: OFF
-- KALDI_NATIVE_FBANK_ENABLE_CHECK: OFF
-- CMAKE_CXX_FLAGS: /DWIN32 /D_WINDOWS /W3 /GR /EHsc /wd4244  /wd4267  /wd4305  /wd4334  /wd4800  /wd4996
-- CMAKE_INSTALL_PREFIX: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx
-- Disabled warnings: /wd4244;/wd4267;/wd4624
-- Disable building Python
-- Downloading kaldi-decoder from https://github.com/k2-fsa/kaldi-decoder/archive/refs/tags/v0.2.5.tar.gz
-- kaldi-decoder is downloaded to C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/kaldi_decoder-src
-- kaldi-decoder's binary dir is C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/kaldi_decoder-build
-- CMAKE_BUILD_TYPE: Release
-- Downloading kaldifst from https://github.com/k2-fsa/kaldifst/archive/refs/tags/v1.7.10.tar.gz
-- kaldifst is downloaded to C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/kaldifst-src
-- kaldifst's binary dir is C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/kaldifst-build
-- CMAKE_BUILD_TYPE: Release
-- Disable building shared libraries for Windows
-- CMAKE_BUILD_TYPE: Release
-- CMAKE_INSTALL_PREFIX: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx
-- BUILD_SHARED_LIBS OFF
-- C++ Standard version: 14
-- Disabled warnings: /wd4018;/wd4244;/wd4267;/wd4291;/wd4305;/wd4996
-- CMAKE_CXX_FLAGS: /DWIN32 /D_WINDOWS /W3 /GR /EHsc /wd4244  /wd4267  /wd4305  /wd4334  /wd4800  /wd4996  /wd4018  /wd4244  /wd4267  /wd4291  /wd4305  /wd4996   
-- Downloading openfst from https://github.com/csukuangfj/openfst/archive/refs/tags/sherpa-onnx-2024-04-09.tar.gz
-- openfst is downloaded to C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src
-- The following ICU libraries were not found:
--   data (required)
--   i18n (required)
--   io (required)
--   test (required)
--   tu (required)
--   uc (required)
-- Failed to find all ICU components (missing: ICU_INCLUDE_DIR ICU_LIBRARY _ICU_REQUIRED_LIBS_FOUND)
-- Could NOT find ZLIB (missing: ZLIB_INCLUDE_DIR) 
CMake Deprecation Warning at build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/CMakeLists.txt:15 (cmake_minimum_required):
  Compatibility with CMake < 3.5 will be removed from a future version of
  CMake.

  Update the VERSION argument <min> value or use a ...<max> suffix to tell
  CMake that the project does not need compatibility with older versions.


-- C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/arc-class.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/arciterator-class.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/arcsort.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/arg-packs.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/closure.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/compile-impl.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/compile.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/compose.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/concat.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/connect.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/convert.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/decode.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/determinize.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/difference.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/disambiguate.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/draw-impl.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/draw.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/encode.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/encodemapper-class.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/epsnormalize.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/equal.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/equivalent.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/fst-class.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-s
rc/src/script/../include/fst/script/fstscript-decl.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/fstscript.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/getters.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/info-impl.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/info.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/intersect.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/invert.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/isomorphic.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/map.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/minimize.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/print-impl.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/print.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/project.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/prune.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/push.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/randequivalent.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/randgen.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/register.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/relabel.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/replace.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/reverse.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/reweight.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/rmepsilon.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/script-impl.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Releas
e/_deps/openfst-src/src/script/../include/fst/script/shortest-distance.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/shortest-path.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/stateiterator-class.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/synchronize.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/text-io.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/topsort.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/union.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/verify.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/script/../include/fst/script/weight-class.h
-- C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/extensions/far/../../include/fst/extensions/far/compile-strings.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/extensions/far/../../include/fst/extensions/far/create.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/extensions/far/../../include/fst/extensions/far/equal.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/extensions/far/../../include/fst/extensions/far/extract.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/extensions/far/../../include/fst/extensions/far/far-class.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/extensions/far/../../include/fst/extensions/far/far.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/extensions/far/../../include/fst/extensions/far/farlib.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/extensions/far/../../include/fst/extensions/far/farscript.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/extensions/far/../../include/fst/extensions/far/getters.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/extensions/far/../../include/fst/extensions/far/info.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/extensions/far/../../include/fst/extensions/far/isomorphic.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/extensions/far/../../include/fst/extensions/far/print-strings.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/extensions/far/../../include/fst/extensions/far/script-impl.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/extensions/far/../../include/fst/extensions/far/stlist.h;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/openfst-src/src/extensions/far/../../include/fst/extensions/far/sttable.h
-- Downloading eigen from https://gitlab.com/libeigen/eigen/-/archive/3.4.0/eigen-3.4.0.tar.gz
-- eigen is downloaded to C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/eigen-src
-- eigen's binary dir is C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/eigen-build
-- Performing Test COMPILER_SUPPORT_std=cpp03
-- Performing Test COMPILER_SUPPORT_std=cpp03 - Failed
-- Standard libraries to link to explicitly: none
-- Found unsuitable Qt version "" from NOTFOUND
-- Qt4 not found, so disabling the mandelbrot and opengl demos
-- 
-- Configured Eigen 3.4.0
--
-- Available targets (use: cmake --build . --target TARGET):
-- ---------+--------------------------------------------------------------
-- Target   |   Description
-- ---------+--------------------------------------------------------------
-- install  | Install Eigen. Headers will be installed to:
--          |     <CMAKE_INSTALL_PREFIX>/<INCLUDE_INSTALL_DIR>
--          |   Using the following values:
--          |     CMAKE_INSTALL_PREFIX: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx
--          |     INCLUDE_INSTALL_DIR:  include/eigen3
--          |   Change the install location of Eigen headers using:
--          |     cmake . -DCMAKE_INSTALL_PREFIX=yourprefix
--          |   Or:
--          |     cmake . -DINCLUDE_INSTALL_DIR=yourdir
-- doc      | Generate the API documentation, requires Doxygen & LaTeX
-- blas     | Build BLAS library (not the same thing as Eigen)
-- uninstall| Remove files installed by the install target
-- ---------+--------------------------------------------------------------
--
-- CMAKE_SYSTEM_NAME: Windows
-- CMAKE_SYSTEM_PROCESSOR: AMD64
-- location_onnxruntime_header_dir: location_onnxruntime_header_dir-NOTFOUND
-- location_onnxruntime_lib: location_onnxruntime_lib-NOTFOUND
-- location_onnxruntime_cuda_lib: location_onnxruntime_cuda_lib-NOTFOUND
-- Could not find a pre-installed onnxruntime.
-- Downloading pre-compiled onnxruntime
-- CMAKE_SYSTEM_NAME: Windows
-- CMAKE_SYSTEM_PROCESSOR: AMD64
-- CMAKE_VS_PLATFORM_NAME: x64
-- Use dynamic onnxruntime libraries
-- CMAKE_SYSTEM_NAME: Windows
-- CMAKE_SYSTEM_PROCESSOR: AMD64
-- CMAKE_VS_PLATFORM_NAME: x64
-- Downloading onnxruntime from https://github.com/microsoft/onnxruntime/releases/download/v1.17.1/onnxruntime-win-x64-gpu-1.17.1.zip
-- onnxruntime is downloaded to C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/onnxruntime-src
-- location_onnxruntime: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/onnxruntime-src/lib/onnxruntime.lib
-- location_onnxruntime_providers_cuda_lib: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/onnxruntime-src/lib/onnxruntime_providers_cuda.lib
-- location_onnxruntime_providers_shared_lib: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/onnxruntime-src/lib/onnxruntime_providers_shared.lib
-- onnxruntime lib files: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/onnxruntime-src/lib/onnxruntime.dll;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/onnxruntime-src/lib/onnxruntime_providers_cuda.dll;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/onnxruntime-src/lib/onnxruntime_providers_shared.dll;C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/onnxruntime-src/lib/onnxruntime_providers_tensorrt.dll
-- ONNXRUNTIME_DIR: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/onnxruntime-src
-- Downloading portaudio from http://files.portaudio.com/archives/pa_stable_v190700_20210406.tgz
-- portaudio is downloaded to C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/portaudio-src
-- portaudio's binary dir is C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/portaudio-build
CMake Deprecation Warning at build/temp.win-amd64-cpython-310/Release/_deps/portaudio-src/CMakeLists.txt:7 (CMAKE_MINIMUM_REQUIRED):
  Compatibility with CMake < 3.5 will be removed from a future version of
  CMake.

  Update the VERSION argument <min> value or use a ...<max> suffix to tell
  CMake that the project does not need compatibility with older versions.


-- Could NOT find ASIOSDK (missing: ASIOSDK_ROOT_DIR ASIOSDK_INCLUDE_DIR)
-- Downloading pybind11 from https://github.com/pybind/pybind11/archive/refs/tags/v2.10.2.tar.gz
-- pybind11 is downloaded to C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/pybind11-src
CMake Deprecation Warning at build/temp.win-amd64-cpython-310/Release/_deps/pybind11-src/CMakeLists.txt:8 (cmake_minimum_required):
  Compatibility with CMake < 3.5 will be removed from a future version of
  CMake.

  Update the VERSION argument <min> value or use a ...<max> suffix to tell
  CMake that the project does not need compatibility with older versions.


-- pybind11 v2.10.2 
CMake Warning (dev) at build/temp.win-amd64-cpython-310/Release/_deps/pybind11-src/tools/FindPythonLibsNew.cmake:98 (find_package):
  Policy CMP0148 is not set: The FindPythonInterp and FindPythonLibs modules
  are removed.  Run "cmake --help-policy CMP0148" for policy details.  Use
  the cmake_policy command to set the policy and suppress this warning.

Call Stack (most recent call first):
  build/temp.win-amd64-cpython-310/Release/_deps/pybind11-src/tools/pybind11Tools.cmake:50 (find_package)
  build/temp.win-amd64-cpython-310/Release/_deps/pybind11-src/tools/pybind11Common.cmake:180 (include)
  build/temp.win-amd64-cpython-310/Release/_deps/pybind11-src/CMakeLists.txt:208 (include)
This warning is for project developers.  Use -Wno-dev to suppress it.

-- Downloading websocketpp from https://github.com/zaphoyd/websocketpp/archive/b9aeec6eaf3d5610503439b4fae3581d9aff08e8.zip
-- websocketpp is downloaded to C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/websocketpp-src
-- Downloading asio https://github.com/chriskohlhoff/asio/archive/refs/tags/asio-1-24-0.tar.gz
-- asio is downloaded to C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/asio-src
-- Downloading espeak-ng from https://github.com/csukuangfj/espeak-ng/archive/69bf6927964fb042aeb827cfdf6082a30f5802eb.zip
-- espeak-ng is downloaded to C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/espeak_ng-src
-- espeak-ng binary dir is C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/espeak_ng-build
-- Configuration:
--   shared: ON
--   mbrola: OFF (MBROLA_BIN-NOTFOUND)
--   libsonic: OFF (SONIC_LIB-NOTFOUND SONIC_INC-NOTFOUND)
--   libpcaudio: OFF (PCAUDIO_LIB-NOTFOUND PCAUDIO_INC-NOTFOUND)
--   klatt: OFF
--   speech-player: OFF
--   async: OFF
-- ESPEAK_NG_DIR: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/espeak_ng-src
-- Downloading piper-phonemize from https://github.com/csukuangfj/piper-phonemize/archive/dc6b5f4441bffe521047086930b0fc12686acd56.zip
-- piper-phonemize is downloaded to C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/piper_phonemize-src
-- piper-phonemize binary dir is C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/piper_phonemize-build
-- ESPEAK_NG_DIR: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/espeak_ng-src
-- ONNXRUNTIME_DIR: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/onnxruntime-src
-- Downloading cppjieba https://github.com/csukuangfj/cppjieba/archive/refs/tags/sherpa-onnx-2024-04-19.tar.gz
-- cppjieba is downloaded to C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/cppjieba-src
CMake Deprecation Warning at build/temp.win-amd64-cpython-310/Release/_deps/cppjieba-src/CMakeLists.txt:3 (cmake_minimum_required):
  Compatibility with CMake < 3.5 will be removed from a future version of
  CMake.

  Update the VERSION argument <min> value or use a ...<max> suffix to tell
  CMake that the project does not need compatibility with older versions.


-- PYTHON_EXECUTABLE: C:/Users/T/Desktop/Code/ai/stella/venv/Scripts/python.exe
-- PYTHON_VERSION: 3.10
-- Downloading cargs https://github.com/likle/cargs/archive/refs/tags/v1.0.3.tar.gz
-- cargs is downloaded to C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release/_deps/cargs-src
-- CMAKE_CXX_FLAGS: /DWIN32 /D_WINDOWS /W3 /GR /EHsc /wd4244  /wd4267  /wd4305  /wd4334  /wd4800  /wd4996
-- CMAKE_CXX_FLAGS: /DWIN32 /D_WINDOWS /W3 /GR /EHsc /wd4244  /wd4267  /wd4305  /wd4334  /wd4800  /wd4996 
-- Configuring done (6.9s)
-- Generating done (2.4s)
-- Build files have been written to: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/temp.win-amd64-cpython-310/Release
MSBuild version 17.8.3+195e7f5a3 for .NET Framework

  fst.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\lib\Release\sherpa-onnx-fst.lib
  Auto build dll exports
  Auto build dll exports
  fstfar.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\lib\Release\sherpa-onnx-fstfar.lib
  kaldifst_core.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\lib\Release\sherpa-onnx-kaldifst-core.lib       
  kaldi-native-fbank-core.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\kaldi-native-fbank-core.dll
  ucd.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\ucd.dll
  Auto build dll exports
  Auto build dll exports
  kaldi-decoder-core.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\kaldi-decoder-core.dll
  espeak-ng.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\espeak-ng.dll
  Auto build dll exports
  piper_phonemize.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\piper_phonemize.dll
  Auto build dll exports
  sherpa-onnx-core.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\sherpa-onnx-core.dll
  _sherpa_onnx.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\lib\Release\_sherpa_onnx.cp310-win_amd64.pyd
  Auto build dll exports
  sherpa-onnx-keyword-spotter.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\sherpa-onnx-keyword-spotter.exe
  sherpa-onnx.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\sherpa-onnx.exe
  Auto build dll exports
  Auto build dll exports
  sherpa-onnx-c-api.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\sherpa-onnx-c-api.dll
  portaudio.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\sherpa-onnx-portaudio.dll
  audio-tagging-c-api.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\audio-tagging-c-api.exe       
  add-punctuation-c-api.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\add-punctuation-c-api.exe
  cargs.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\cargs.dll
  sherpa-onnx-microphone.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\sherpa-onnx-microphone.exe
  sherpa-onnx-keyword-spotter-microphone.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\sherpa-onnx-keyword-spotter-microphone.exe
  decode-file-c-api.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\decode-file-c-api.exe
  offline-tts-c-api.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\offline-tts-c-api.exe
  spoken-language-identification-c-api.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\spoken-language-identification-c-api.exe
  sherpa-onnx-offline-language-identification.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\sherpa-onnx-offline-language-identification.exe
  speaker-identification-c-api.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\speaker-identification-c-api.exe
  sherpa-onnx-microphone-offline.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\sherpa-onnx-microphone-offline.exe
  sherpa-onnx-vad-microphone-offline-asr.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\sherpa-onnx-vad-microphone-offline-asr.exe
  sherpa-onnx-offline-tts.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\sherpa-onnx-offline-tts.exe
  streaming-hlg-decode-file-c-api.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\streaming-hlg-decode-file-c-api.exe
  sherpa-onnx-online-websocket-client.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\sherpa-onnx-online-websocket-client.exe
  sherpa-onnx-offline-punctuation.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\sherpa-onnx-offline-punctuation.exe
  sherpa-onnx-offline-parallel.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\sherpa-onnx-offline-parallel.exe
  sherpa-onnx-offline-audio-tagging.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\sherpa-onnx-offline-audio-tagging.exe
  sherpa-onnx-microphone-offline-audio-tagging.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\sherpa-onnx-microphone-offline-audio-tagging.exe
  sherpa-onnx-microphone-offline-speaker-identification.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\sherpa-onnx-microphone-offline-speaker-identification.exe
  sherpa-onnx-vad-microphone.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\sherpa-onnx-vad-microphone.exe
  sherpa-onnx-offline.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\sherpa-onnx-offline.exe       
  sherpa-onnx-offline-tts-play.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\sherpa-onnx-offline-tts-play.exe
  sherpa-onnx-online-websocket-server.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\sherpa-onnx-online-websocket-server.exe
  sherpa-onnx-offline-websocket-server.vcxproj -> C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\temp.win-amd64-cpython-310\Release\bin\Release\sherpa-onnx-offline-websocket-server.exe
  1>
  -- Install configuration: "Release"
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/../kaldi-native-fbank-core.lib
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/../kaldi-native-fbank-core.dll
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/kaldi-native-fbank-core.lib
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/kaldi-native-fbank-core.dll
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/../kaldi-decoder-core.lib
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/../kaldi-decoder-core.dll
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/../sherpa-onnx-kaldifst-core.lib
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/../sherpa-onnx-fst.lib
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/../sherpa-onnx-fstfar.lib
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/kaldi-decoder-core.lib
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/kaldi-decoder-core.dll
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/sherpa-onnx-kaldifst-core.lib
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/sherpa-onnx-fst.lib
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/sherpa-onnx-fstfar.lib
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/../onnxruntime.dll
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/../onnxruntime_providers_cuda.dll
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/../onnxruntime_providers_shared.dll
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/../onnxruntime_providers_tensorrt.dll
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/onnxruntime.dll
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/onnxruntime_providers_cuda.dll
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/onnxruntime_providers_shared.dll
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/onnxruntime_providers_tensorrt.dll
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/../sherpa-onnx-portaudio.lib
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/../sherpa-onnx-portaudio.dll
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/../espeak-ng.lib
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/../espeak-ng.dll
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/../ucd.lib
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/../ucd.dll
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/espeak-ng.lib
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/espeak-ng.dll
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/ucd.lib
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/ucd.dll
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/../piper_phonemize.lib
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/../piper_phonemize.dll
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/piper_phonemize.lib
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/piper_phonemize.dll
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/./sherpa-onnx.pc
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/lib/pkgconfig/espeak-ng.pc
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/share/vim/vimfiles/ftdetect
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/share/vim/vimfiles/ftdetect/espeakfiletype.vim
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/share/vim/vimfiles/syntax
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/share/vim/vimfiles/syntax/espeaklist.vim
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/share/vim/vimfiles/syntax/espeakrules.vim
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/lib/ucd.lib
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/ucd.dll
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/lib/espeak-ng.lib
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/espeak-ng.dll
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/../sherpa-onnx-core.lib
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/../sherpa-onnx-core.dll
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/sherpa-onnx-core.lib
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/sherpa-onnx-core.dll
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/sherpa-onnx.exe
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/sherpa-onnx-keyword-spotter.exe
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/sherpa-onnx-offline.exe
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/sherpa-onnx-offline-audio-tagging.exe
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/sherpa-onnx-offline-language-identification.exe
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/sherpa-onnx-offline-parallel.exe
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/sherpa-onnx-offline-punctuation.exe
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/sherpa-onnx-offline-tts.exe
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/sherpa-onnx-microphone.exe
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/sherpa-onnx-keyword-spotter-microphone.exe
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/sherpa-onnx-microphone-offline.exe
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/sherpa-onnx-microphone-offline-speaker-identification.exe
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/sherpa-onnx-microphone-offline-audio-tagging.exe   
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/sherpa-onnx-vad-microphone.exe
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/sherpa-onnx-vad-microphone-offline-asr.exe
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/sherpa-onnx-offline-tts-play.exe
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/sherpa-onnx-online-websocket-server.exe
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/sherpa-onnx-online-websocket-client.exe
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/bin/sherpa-onnx-offline-websocket-server.exe
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/../_sherpa_onnx.cp310-win_amd64.pyd
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/lib/sherpa-onnx-c-api.lib
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/lib/sherpa-onnx-c-api.dll
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/include/sherpa-onnx/c-api/c-api.h
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/lib/cargs.lib
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/lib/cargs.dll
  -- Installing: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/lib/cargs.h
  -- Up-to-date: C:/Users/T/Desktop/Code/ai/stella/sherpa-onnx/build/lib.win-amd64-cpython-310/sherpa_onnx/include/cargs.h
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\sherpa-onnx.exe to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\sherpa-onnx-keyword-spotter.exe to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\sherpa-onnx-microphone.exe to build\sherpa_onnx\bin/        
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\sherpa-onnx-microphone-offline.exe to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\sherpa-onnx-microphone-offline-audio-tagging.exe to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\sherpa-onnx-microphone-offline-speaker-identification.exe to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\sherpa-onnx-offline.exe to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\sherpa-onnx-offline-audio-tagging.exe to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\sherpa-onnx-offline-language-identification.exe to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\sherpa-onnx-offline-punctuation.exe to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\sherpa-onnx-offline-tts.exe to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\sherpa-onnx-offline-tts-play.exe to build\sherpa_onnx\bin/  
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\sherpa-onnx-offline-websocket-server.exe to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\sherpa-onnx-online-websocket-client.exe to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\sherpa-onnx-online-websocket-server.exe to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\sherpa-onnx-vad-microphone.exe to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\sherpa-onnx-vad-microphone-offline-asr.exe to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\espeak-ng.dll to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\kaldi-decoder-core.dll to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\kaldi-native-fbank-core.dll to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\onnxruntime.dll to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\piper_phonemize.dll to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\lib\sherpa-onnx-c-api.dll to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\sherpa-onnx-core.dll to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\sherpa-onnx-fstfar.lib to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\sherpa-onnx-fst.lib to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\sherpa-onnx-kaldifst-core.lib to build\sherpa_onnx\bin/     
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\..\sherpa-onnx-portaudio.dll to build\sherpa_onnx\bin/
Copying C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\build\lib.win-amd64-cpython-310\sherpa_onnx\bin\ucd.dll to build\sherpa_onnx\bin/
creating build\bdist.win-amd64\egg
copying build\lib.win-amd64-cpython-310\espeak-ng.dll -> build\bdist.win-amd64\egg
copying build\lib.win-amd64-cpython-310\espeak-ng.lib -> build\bdist.win-amd64\egg
copying build\lib.win-amd64-cpython-310\kaldi-decoder-core.dll -> build\bdist.win-amd64\egg
copying build\lib.win-amd64-cpython-310\kaldi-decoder-core.lib -> build\bdist.win-amd64\egg
copying build\lib.win-amd64-cpython-310\kaldi-native-fbank-core.dll -> build\bdist.win-amd64\egg
copying build\lib.win-amd64-cpython-310\kaldi-native-fbank-core.lib -> build\bdist.win-amd64\egg
copying build\lib.win-amd64-cpython-310\onnxruntime.dll -> build\bdist.win-amd64\egg
copying build\lib.win-amd64-cpython-310\onnxruntime_providers_cuda.dll -> build\bdist.win-amd64\egg
copying build\lib.win-amd64-cpython-310\onnxruntime_providers_shared.dll -> build\bdist.win-amd64\egg
copying build\lib.win-amd64-cpython-310\onnxruntime_providers_tensorrt.dll -> build\bdist.win-amd64\egg
copying build\lib.win-amd64-cpython-310\piper_phonemize.dll -> build\bdist.win-amd64\egg
copying build\lib.win-amd64-cpython-310\piper_phonemize.lib -> build\bdist.win-amd64\egg
copying build\lib.win-amd64-cpython-310\sherpa-onnx-core.dll -> build\bdist.win-amd64\egg
copying build\lib.win-amd64-cpython-310\sherpa-onnx-core.lib -> build\bdist.win-amd64\egg
copying build\lib.win-amd64-cpython-310\sherpa-onnx-fst.lib -> build\bdist.win-amd64\egg
copying build\lib.win-amd64-cpython-310\sherpa-onnx-fstfar.lib -> build\bdist.win-amd64\egg
copying build\lib.win-amd64-cpython-310\sherpa-onnx-kaldifst-core.lib -> build\bdist.win-amd64\egg
copying build\lib.win-amd64-cpython-310\sherpa-onnx-portaudio.dll -> build\bdist.win-amd64\egg
copying build\lib.win-amd64-cpython-310\sherpa-onnx-portaudio.lib -> build\bdist.win-amd64\egg
creating build\bdist.win-amd64\egg\sherpa_onnx
copying build\lib.win-amd64-cpython-310\sherpa_onnx\cli.py -> build\bdist.win-amd64\egg\sherpa_onnx
creating build\bdist.win-amd64\egg\sherpa_onnx\include
copying build\lib.win-amd64-cpython-310\sherpa_onnx\include\cargs.h -> build\bdist.win-amd64\egg\sherpa_onnx\include
creating build\bdist.win-amd64\egg\sherpa_onnx\include\sherpa-onnx
creating build\bdist.win-amd64\egg\sherpa_onnx\include\sherpa-onnx\c-api
copying build\lib.win-amd64-cpython-310\sherpa_onnx\include\sherpa-onnx\c-api\c-api.h -> build\bdist.win-amd64\egg\sherpa_onnx\include\sherpa-onnx\c-api
copying build\lib.win-amd64-cpython-310\sherpa_onnx\keyword_spotter.py -> build\bdist.win-amd64\egg\sherpa_onnx
copying build\lib.win-amd64-cpython-310\sherpa_onnx\offline_recognizer.py -> build\bdist.win-amd64\egg\sherpa_onnx
copying build\lib.win-amd64-cpython-310\sherpa_onnx\online_recognizer.py -> build\bdist.win-amd64\egg\sherpa_onnx
copying build\lib.win-amd64-cpython-310\sherpa_onnx\sherpa-onnx.pc -> build\bdist.win-amd64\egg\sherpa_onnx
copying build\lib.win-amd64-cpython-310\sherpa_onnx\utils.py -> build\bdist.win-amd64\egg\sherpa_onnx
copying build\lib.win-amd64-cpython-310\sherpa_onnx\__init__.py -> build\bdist.win-amd64\egg\sherpa_onnx
copying build\lib.win-amd64-cpython-310\ucd.dll -> build\bdist.win-amd64\egg
copying build\lib.win-amd64-cpython-310\ucd.lib -> build\bdist.win-amd64\egg
copying build\lib.win-amd64-cpython-310\_sherpa_onnx.cp310-win_amd64.pyd -> build\bdist.win-amd64\egg
byte-compiling build\bdist.win-amd64\egg\sherpa_onnx\cli.py to cli.cpython-310.pyc
byte-compiling build\bdist.win-amd64\egg\sherpa_onnx\keyword_spotter.py to keyword_spotter.cpython-310.pyc
byte-compiling build\bdist.win-amd64\egg\sherpa_onnx\offline_recognizer.py to offline_recognizer.cpython-310.pyc
byte-compiling build\bdist.win-amd64\egg\sherpa_onnx\online_recognizer.py to online_recognizer.cpython-310.pyc
byte-compiling build\bdist.win-amd64\egg\sherpa_onnx\utils.py to utils.cpython-310.pyc
byte-compiling build\bdist.win-amd64\egg\sherpa_onnx\__init__.py to __init__.cpython-310.pyc
creating stub loader for _sherpa_onnx.cp310-win_amd64.pyd
byte-compiling build\bdist.win-amd64\egg\_sherpa_onnx.py to _sherpa_onnx.cpython-310.pyc
installing package data to build\bdist.win-amd64\egg
running install_data
creating build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\sherpa-onnx.exe -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\sherpa-onnx-keyword-spotter.exe -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\sherpa-onnx-microphone.exe -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\sherpa-onnx-microphone-offline.exe -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\sherpa-onnx-microphone-offline-audio-tagging.exe -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\sherpa-onnx-microphone-offline-speaker-identification.exe -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\sherpa-onnx-offline.exe -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\sherpa-onnx-offline-audio-tagging.exe -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\sherpa-onnx-offline-language-identification.exe -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\sherpa-onnx-offline-punctuation.exe -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\sherpa-onnx-offline-tts.exe -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\sherpa-onnx-offline-tts-play.exe -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\sherpa-onnx-offline-websocket-server.exe -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\sherpa-onnx-online-websocket-client.exe -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\sherpa-onnx-online-websocket-server.exe -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\sherpa-onnx-vad-microphone.exe -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\sherpa-onnx-vad-microphone-offline-asr.exe -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\espeak-ng.dll -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\kaldi-decoder-core.dll -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\kaldi-native-fbank-core.dll -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\onnxruntime.dll -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\piper_phonemize.dll -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\sherpa-onnx-c-api.dll -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\sherpa-onnx-core.dll -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\sherpa-onnx-fstfar.lib -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\sherpa-onnx-fst.lib -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\sherpa-onnx-kaldifst-core.lib -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\sherpa-onnx-portaudio.dll -> build\bdist.win-amd64\egg\bin
copying build\sherpa_onnx\bin\ucd.dll -> build\bdist.win-amd64\egg\bin
creating build\bdist.win-amd64\egg\EGG-INFO
copying sherpa_onnx.egg-info\PKG-INFO -> build\bdist.win-amd64\egg\EGG-INFO
copying sherpa_onnx.egg-info\SOURCES.txt -> build\bdist.win-amd64\egg\EGG-INFO
copying sherpa_onnx.egg-info\dependency_links.txt -> build\bdist.win-amd64\egg\EGG-INFO
copying sherpa_onnx.egg-info\entry_points.txt -> build\bdist.win-amd64\egg\EGG-INFO
copying sherpa_onnx.egg-info\not-zip-safe -> build\bdist.win-amd64\egg\EGG-INFO
copying sherpa_onnx.egg-info\top_level.txt -> build\bdist.win-amd64\egg\EGG-INFO
writing build\bdist.win-amd64\egg\EGG-INFO\native_libs.txt
creating 'dist\sherpa_onnx-1.9.24-py3.10-win-amd64.egg' and adding 'build\bdist.win-amd64\egg' to it
removing 'build\bdist.win-amd64\egg' (and everything under it)
Processing sherpa_onnx-1.9.24-py3.10-win-amd64.egg
creating c:\users\t\desktop\code\ai\stella\venv\lib\site-packages\sherpa_onnx-1.9.24-py3.10-win-amd64.egg
Extracting sherpa_onnx-1.9.24-py3.10-win-amd64.egg to c:\users\t\desktop\code\ai\stella\venv\lib\site-packages
Adding sherpa-onnx 1.9.24 to easy-install.pth file
Installing sherpa-onnx-cli-script.py script to C:\Users\T\Desktop\Code\ai\stella\venv\Scripts
Installing sherpa-onnx-cli.exe script to C:\Users\T\Desktop\Code\ai\stella\venv\Scripts

Installed c:\users\t\desktop\code\ai\stella\venv\lib\site-packages\sherpa_onnx-1.9.24-py3.10-win-amd64.egg
Processing dependencies for sherpa-onnx==1.9.24
Finished processing dependencies for sherpa-onnx==1.9.24
@csukuangfj
Collaborator

I installed onnxruntime-gpu specifically for CUDA 12.x following the instructions from

That won't affect the onnxruntime used in sherpa-onnx.

Could you try CUDA 11.8, since we are using onnxruntime 1.17.1 in sherpa-onnx?
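
For context, LoadLibrary error 126 usually means a dependency of onnxruntime_providers_cuda.dll (cuBLAS/cuDNN) cannot be found on PATH, not that the provider DLL itself is missing. Here is a minimal diagnostic sketch; the site-packages path is an assumption taken from the traceback, so adjust it to your venv. ctypes is only used to surface the Windows loader error directly:

```python
# Sketch: try to load the CUDA provider DLL directly so the Windows loader
# error (e.g. WinError 126) is raised outside of onnxruntime.
# The site-packages path below is an assumption; adjust it to your venv.
import ctypes
from pathlib import Path

site_packages = Path("venv/Lib/site-packages")
for dll in site_packages.rglob("onnxruntime_providers_cuda.dll"):
    try:
        ctypes.WinDLL(str(dll))
        print(f"loaded OK: {dll}")
    except OSError as e:
        # Error 126 here typically points at a missing CUDA/cuDNN DLL on PATH.
        print(f"failed to load {dll}: {e}")
```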

@tn-17
Author

tn-17 commented May 15, 2024

I have uninstalled CUDA 12.4 and installed CUDA 11.8.

(screenshot of the CUDA 11.8 installation)

Then I used python setup.py install again to rebuild and install sherpa-onnx for Nvidia GPU.

Then I ran the offline-tts-play.py example again. This got past the onnxruntime_providers_cuda.dll error. However, a new error appeared.

Could not locate cublasLt64_12.dll. Please make sure it is in your library path!

After some Google searching, I think this means I need to update CUDA to a newer version?

python piper_stream_example.py --vits-model=./en_US-libritts_r-medium.onnx --vits-tokens=./tokens.txt --vits-data-dir=./espeak-ng-data --output-filename=./test.wav --provider cuda --debug True 'This is a test'
Namespace(vits_model='./en_US-libritts_r-medium.onnx', vits_lexicon='', vits_tokens='./tokens.txt', vits_data_dir='./espeak-ng-data', vits_dict_dir='', tts_rule_fsts='', output_filename='./test.wav', sid=0, debug=True, provider='cuda', num_threads=1, speed=1.0, text='This is a test')
2024-05-15 00:12:24,597 INFO [piper_stream_example.py:320] Loading model ...
2024-05-15 00:12:26.0113635 [W:onnxruntime:, transformer_memcpy.cc:74 onnxruntime::MemcpyTransformer::ApplyImpl] 28 Memcpy nodes are added to the graph torch_jit for CUDAExecutionProvider. It might have negative impact on performance (including unable to run CUDA graph). Set session_options.log_severity_level=1 to see the detail logs before this message.
2024-05-15 00:12:26.0302070 [W:onnxruntime:, session_state.cc:1166 onnxruntime::VerifyEachNodeIsAssignedToAnEp] Some nodes were not assigned to the preferred execution providers which may or may not have an negative impact on performance. e.g. ORT explicitly assigns shape related ops to CPU to improve perf.  
2024-05-15 00:12:26.0345325 [W:onnxruntime:, session_state.cc:1168 onnxruntime::VerifyEachNodeIsAssignedToAnEp] Rerunning with verbose output on a non-minimal build will show node assignments.
Could not locate cublasLt64_12.dll. Please make sure it is in your library path!
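
For reference, here is a quick sketch (nothing sherpa-onnx specific; plain Python) that lists which cuBLAS DLLs are visible on PATH. The file name encodes the CUDA major version it belongs to (cublasLt64_11.dll for CUDA 11.x, cublasLt64_12.dll for CUDA 12.x), so this shows which toolkit the runtime will actually find:

```python
# Sketch: scan every PATH entry for cuBLAS DLLs to see which CUDA major
# version is actually visible to the process.
import os
from pathlib import Path

for entry in os.environ.get("PATH", "").split(os.pathsep):
    directory = Path(entry)
    if not directory.is_dir():
        continue
    for dll in directory.glob("cublas*64_*.dll"):
        print(dll)
```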

@tn-17
Author

tn-17 commented May 15, 2024

CUDA 11.8 contains cublasLt64_11.dll, so I uninstalled 11.8 and installed 12.2.

I did not rebuild and reinstall sherpa-onnx for gpu.

I tried running the offline-tts-play.py example and encountered the onnxruntime_providers_cuda.dll error again.

Next, I reinstalled 11.8, keeping 12.2 as well since it is possible to have multiple installations.

I updated the PATH back to 11.8.

I retried the example script and it got further. This time, it produced an error about zlibwapi.dll.

I got zlibwapi.dll from http://www.winimage.com/zLibDll/ as per the NVIDIA CUDA and cuDNN installation instructions.

  • this is version 1.2.3
python piper_stream_example.py --vits-model=./en_US-libritts_r-medium.onnx --vits-tokens=./tokens.txt --vits-data-dir=./espeak-ng-data --output-filename=./test.wav --provider cuda --debug True 'This is a test'
Namespace(vits_model='./en_US-libritts_r-medium.onnx', vits_lexicon='', vits_tokens='./tokens.txt', vits_data_dir='./espeak-ng-data', vits_dict_dir='', tts_rule_fsts='', output_filename='./test.wav', sid=0, debug=True, provider='cuda', num_threads=1, speed=1.0, text='This is a test')
2024-05-15 00:32:10,848 INFO [piper_stream_example.py:320] Loading model ...
2024-05-15 00:32:11.9375753 [W:onnxruntime:, transformer_memcpy.cc:74 onnxruntime::MemcpyTransformer::ApplyImpl] 28 Memcpy nodes are added to the graph torch_jit for CUDAExecutionProvider. It might have negative impact on performance (including unable to run CUDA graph). Set session_options.log_severity_level=1 to see the detail logs before this message.
2024-05-15 00:32:11.9569339 [W:onnxruntime:, session_state.cc:1166 onnxruntime::VerifyEachNodeIsAssignedToAnEp] Some nodes were not assigned to the preferred execution providers which may or may not have an negative impact on performance. e.g. ORT explicitly assigns shape related ops to CPU to improve perf.
2024-05-15 00:32:11.9607816 [W:onnxruntime:, session_state.cc:1168 onnxruntime::VerifyEachNodeIsAssignedToAnEp] Rerunning with verbose output on a non-minimal build will show node assignments.
C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\sherpa-onnx\csrc\offline-tts-vits-model.cc:Init:79 ---vits model---
model_type=vits
comment=piper
has_espeak=1
language=English
voice=en-us
n_speakers=904
sample_rate=22050
----------input names----------
0 input
1 input_lengths
2 scales
3 sid
----------output names----------
0 output


2024-05-15 00:32:12,393 INFO [piper_stream_example.py:322] Loading model done.
2024-05-15 00:32:12,394 INFO [piper_stream_example.py:330] Start generating ...
C:\Users\T\Desktop\Code\ai\stella\sherpa-onnx\sherpa-onnx/csrc/offline-tts-vits-impl.h:Generate:165 Raw text: This is a test
Could not load library zlibwapi.dll. Error code 193. Please verify that the library is built correctly for your processor architecture (32-bit, 64-bit)
(venv)

@tn-17
Author

tn-17 commented May 15, 2024

I was using the precompiled DLLs for 32-bit. I downloaded the correct 64-bit ones from http://www.winimage.com/zLibDll/ and now there are no errors when running the offline-tts-play.py example.
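
For anyone else hitting error 193: it means the DLL's architecture does not match the 64-bit Python process. A rough sketch (not an official tool) that reads the PE header to tell whether a DLL is 32-bit or 64-bit; the path is only an example:

```python
# Sketch: read the COFF "Machine" field from a DLL's PE header to check
# whether it is a 32-bit (0x14c) or 64-bit (0x8664) binary.
import struct

def pe_machine(path: str) -> str:
    with open(path, "rb") as f:
        header = f.read(4096)
    pe_offset = struct.unpack_from("<I", header, 0x3C)[0]   # e_lfanew
    machine = struct.unpack_from("<H", header, pe_offset + 4)[0]
    return {0x14C: "x86 (32-bit)", 0x8664: "x64 (64-bit)"}.get(machine, hex(machine))

print(pe_machine(r"C:\path\to\zlibwapi.dll"))  # example path
```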

Thank you for the help! @csukuangfj

By the way, is there a performance issue with onnxruntime GPU?

I am finding that CPU is faster than GPU when measuring the "time in seconds to receive the first message" for generating the TTS audio.

My GPU is an RTX 3090. My CPU is an i9-14900K.

2024-05-15 00:50:46.3377129 [W:onnxruntime:, transformer_memcpy.cc:74 onnxruntime::MemcpyTransformer::ApplyImpl] 28 Memcpy nodes are added to the graph torch_jit for CUDAExecutionProvider. It might have negative impact on performance (including unable to run CUDA graph). Set session_options.log_severity_level=1 to see the detail logs before this message.
2024-05-15 00:50:46.3555852 [W:onnxruntime:, session_state.cc:1166 onnxruntime::VerifyEachNodeIsAssignedToAnEp] Some nodes were not assigned to the preferred execution providers which may or may not have an negative impact on performance. e.g. ORT explicitly assigns shape related ops to CPU to improve perf.
2024-05-15 00:50:46.3603249 [W:onnxruntime:, session_state.cc:1168 onnxruntime::VerifyEachNodeIsAssignedToAnEp] Rerunning with verbose output on a non-minimal build will show node assignments.

tn-17 closed this as completed on May 15, 2024
@csukuangfj
Collaborator

Glad to hear that you finally managed to run sherpa-onnx with GPU on Windows.

I am finding that cpu is faster than gpu when measuring the "time in seconds to receive the first message" for generating the tts audio.

GPU needs warmup. Also, the advantage of GPU is parallel processing.

Moving data between CPU and GPU also takes time. In other words, GPU is not necessarily faster than CPU if you want to synthesize a single utterance.
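
A minimal timing sketch along those lines, assuming the tts object and the generate(text, sid, speed) call from the offline-tts-play.py example; the first call absorbs the CUDA/cuDNN initialization cost, so the second measurement is closer to steady state:

```python
# Sketch: time a warmup synthesis separately from the measured one so CUDA
# kernel/cuDNN initialization is not counted against the real request.
import time

# tts = sherpa_onnx.OfflineTts(tts_config), as in the example script

def timed_generate(tts, text, sid=0, speed=1.0):
    start = time.perf_counter()
    audio = tts.generate(text, sid=sid, speed=speed)
    return audio, time.perf_counter() - start

_, warmup_s = timed_generate(tts, "Warmup sentence.")
_, steady_s = timed_generate(tts, "This is a test")
print(f"warmup: {warmup_s:.2f}s, steady state: {steady_s:.2f}s")
```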

@SolomonLeon

Since www.winimage.com is unreachable now, I have uploaded the DLL here: zlib123dllx64.zip (downloaded from http://www.winimage.com/zLibDll/zlib123dllx64.zip).

You can try placing the zlibwapi.dll file into the CUDA directory (such as C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8\bin).
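
A one-line sketch of that step (the source path is an example; adjust it to wherever you extracted the zip and to your installed CUDA version):

```python
# Sketch: copy the 64-bit zlibwapi.dll next to the CUDA runtime DLLs.
import shutil

shutil.copy(r"C:\Downloads\zlibwapi.dll",  # example source path
            r"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8\bin")
```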
