can not run "from nvidia.dali.pipeline import Pipeline" #5
Comments
Hi,
I found an error in the RPATH updating in bundle-wheel.sh that likely accounts for this. I have a fix pending.
(It would still help to know the answers to @JanuszL's questions so that I can attempt to reproduce the problem. Further: are you using virtualenv or Conda environments or any similar setup?)
Hi, I got a similar problem. Here is the traceback:
The lib64 directory has been added to LD_LIBRARY_PATH, and everything works fine when I run PyTorch, so I think CUDA 9.0 is properly installed.
I use a conda environment, BTW, and this is what `pip show nvidia-dali` gives:
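Since the CUDA libraries are resolved through LD_LIBRARY_PATH, it can help to verify programmatically that the expected directory is actually on the path the failing process sees (conda activation can alter environment variables). A minimal sketch; the `/usr/local/cuda-9.0/lib64` path is an assumption, adjust for your install:

```python
import os

def on_ld_library_path(libdir, env=None):
    """Return True if libdir appears as an entry in LD_LIBRARY_PATH."""
    env = os.environ if env is None else env
    entries = env.get("LD_LIBRARY_PATH", "").split(":")
    return libdir in entries

# Hypothetical check for a CUDA 9.0 install (path is an assumption):
# on_ld_library_path("/usr/local/cuda-9.0/lib64")
```

Run the check from the same shell (and conda environment) where the import fails, not a fresh one.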
@JanuszL @cliffwoolley
I have searched for libdali.so in my conda env and found nothing. But when I compiled DALI from source, I got libdali.so.
@xxradon
It's not called libdali.so in the installed binary package; it's libdali-*.so, where the * is a hash. bundle-wheel.sh renames it when packaging the wheel, but because of some path errors, not all references to it are corrected properly. I have a fix for that, but the fix exposes another error that I'm still trying to sort out.
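When an import fails with "cannot open shared object file", the dynamic loader's own error message names exactly which dependency could not be resolved, which is useful when the library was renamed during packaging. A sketch of loading a library directly with ctypes to surface that message; the library name in the usage comment is just an illustration:

```python
import ctypes

def try_load(name):
    """Attempt to dlopen a shared library by name.

    Returns None on success, or the loader's error message on failure;
    that message names the unresolved file (e.g. a renamed libdali-<hash>.so).
    """
    try:
        ctypes.CDLL(name)
        return None
    except OSError as err:
        return str(err)

# Hypothetical usage against the installed wheel's library:
# print(try_load("libdali.so"))
```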
I have tested the recent release edition; this problem has been fixed. Thank you.
Actually this isn't fixed yet. :) Glad it worked for you regardless, but I'm reopening this until I push the proper fix. |
When I follow the 3.1 Binary Installation steps in https://docs.nvidia.com/deeplearning/sdk/dali-install-guide/index.html, everything is OK. But when I try "Getting Started" and run `from nvidia.dali.pipeline import Pipeline` in Python, I get this error: `ImportError: libdali.so: cannot open shared object file: No such file or directory`