
Compile issue on Linux #60

Closed
jasonmhead opened this issue Mar 25, 2023 · 1 comment

Comments

@jasonmhead

I switched over to Linux, installed ninja, and now hit what looks like a compile issue.
Any suggestions?

python chat.py


ChatRWKV v2 https://github.com/BlinkDL/ChatRWKV

English - cuda fp16 - /media/main/C/Users/Jason/Documents/machine_learning/language_ML/ChatRWKV/v2/prompt/default/English-2.py
Using /home/main/.cache/torch_extensions/py39_cu117 as PyTorch extensions root...
Detected CUDA files, patching ldflags
Emitting ninja build file /home/main/.cache/torch_extensions/py39_cu117/wkv_cuda/build.ninja...
Building extension module wkv_cuda...
Allowing ninja to set a default number of workers... (overridable by setting the environment variable MAX_JOBS=N)
[1/1] c++ wrapper.o operators.cuda.o -shared -L/home/main/miniconda3/envs/gptj/lib/python3.9/site-packages/torch/lib -lc10 -lc10_cuda -ltorch_cpu -ltorch_cuda_cu -ltorch_cuda_cpp -ltorch -ltorch_python -L/home/main/miniconda3/envs/gptj/lib64 -lcudart -o wkv_cuda.so
FAILED: wkv_cuda.so 
c++ wrapper.o operators.cuda.o -shared -L/home/main/miniconda3/envs/gptj/lib/python3.9/site-packages/torch/lib -lc10 -lc10_cuda -ltorch_cpu -ltorch_cuda_cu -ltorch_cuda_cpp -ltorch -ltorch_python -L/home/main/miniconda3/envs/gptj/lib64 -lcudart -o wkv_cuda.so
/usr/bin/ld: cannot find -lcudart: No such file or directory
collect2: error: ld returned 1 exit status
ninja: build stopped: subcommand failed.
Traceback (most recent call last):
  File "/home/main/miniconda3/envs/gptj/lib/python3.9/site-packages/torch/utils/cpp_extension.py", line 1900, in _run_ninja_build
    subprocess.run(
  File "/home/main/miniconda3/envs/gptj/lib/python3.9/subprocess.py", line 528, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['ninja', '-v']' returned non-zero exit status 1.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/media/main/C/Users/Jason/Documents/machine_learning/language_ML/ChatRWKV/v2/chat.py", line 105, in <module>
    from rwkv.model import RWKV
  File "/media/main/C/Users/Jason/Documents/machine_learning/language_ML/ChatRWKV/v2/../rwkv_pip_package/src/rwkv/model.py", line 29, in <module>
    load(
  File "/home/main/miniconda3/envs/gptj/lib/python3.9/site-packages/torch/utils/cpp_extension.py", line 1284, in load
    return _jit_compile(
  File "/home/main/miniconda3/envs/gptj/lib/python3.9/site-packages/torch/utils/cpp_extension.py", line 1508, in _jit_compile
    _write_ninja_file_and_build_library(
  File "/home/main/miniconda3/envs/gptj/lib/python3.9/site-packages/torch/utils/cpp_extension.py", line 1623, in _write_ninja_file_and_build_library
    _run_ninja_build(
  File "/home/main/miniconda3/envs/gptj/lib/python3.9/site-packages/torch/utils/cpp_extension.py", line 1916, in _run_ninja_build
    raise RuntimeError(message) from e
RuntimeError: Error building extension 'wkv_cuda'
@BlinkDL
Owner

BlinkDL commented Mar 25, 2023

Please read the README at https://github.com/BlinkDL/ChatRWKV:

Note RWKV_CUDA_ON will build a CUDA kernel ("pip install ninja" first).

How to build on Linux: set these environment variables, then run v2/chat.py:

export PATH=/usr/local/cuda/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib64:$LD_LIBRARY_PATH
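The linker error means ld could not find libcudart, the CUDA runtime library, on its search path. The two exports above fix that when the CUDA Toolkit is installed under /usr/local/cuda; a minimal sketch with a sanity check (adjust the paths if your toolkit lives elsewhere):

```shell
# Put the CUDA Toolkit's compiler and runtime library on the search paths.
export PATH=/usr/local/cuda/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib64:$LD_LIBRARY_PATH

# Sanity check: the CUDA bin directory should now lead PATH.
case "$PATH" in
  /usr/local/cuda/bin:*) echo "PATH ok" ;;
  *) echo "PATH missing CUDA" ;;
esac

# If the link step still fails with -lcudart, confirm the library exists:
#   ls /usr/local/cuda/lib64/libcudart.so*
```

These exports last only for the current shell session; add them to your shell profile (e.g. ~/.bashrc) to make them permanent.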

@BlinkDL BlinkDL closed this as completed Mar 25, 2023