[C++] HADOOP_HOME doesn't work to find libhdfs.so #24069
Comments
Krisztian Szucs / @kszucs:

```shell
$ echo $HADOOP_HOME
/opt/hadoop/latest
$ echo $ARROW_LIBHDFS_DIR

```

Note that ARROW_LIBHDFS_DIR is unset. According to https://arrow.apache.org/docs/python/filesystems.html, IMHO it doesn't seem to make sense to try loading libhdfs.so directly from $HADOOP_HOME.
Krisztian Szucs / @kszucs: Could you please try setting ARROW_LIBHDFS_DIR=$HADOOP_HOME/lib/native to see whether that resolves the issue?
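For readers landing here later, the lookup order this workaround produces can be sketched in plain Python (this helper is hypothetical, not Arrow's actual loader code; the environment variable names and the lib/native convention come from the comments above):

```python
import os.path


def libhdfs_candidates(environ):
    """Return the libhdfs.so locations a loader following the suggested
    setup would try, in order: ARROW_LIBHDFS_DIR first, then
    HADOOP_HOME/lib/native. Hypothetical sketch, not Arrow's real code."""
    candidates = []
    if environ.get("ARROW_LIBHDFS_DIR"):
        candidates.append(os.path.join(environ["ARROW_LIBHDFS_DIR"], "libhdfs.so"))
    if environ.get("HADOOP_HOME"):
        candidates.append(os.path.join(environ["HADOOP_HOME"], "lib", "native", "libhdfs.so"))
    return candidates


# With the workaround from the comment above applied:
env = {"HADOOP_HOME": "/opt/hadoop/latest"}
env["ARROW_LIBHDFS_DIR"] = os.path.join(env["HADOOP_HOME"], "lib", "native")
print(libhdfs_candidates(env))
# ['/opt/hadoop/latest/lib/native/libhdfs.so', '/opt/hadoop/latest/lib/native/libhdfs.so']
```

Both entries point at the same file, which is why exporting ARROW_LIBHDFS_DIR masks the HADOOP_HOME lookup problem.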
Jack Fan: IMO I should have a working setup with only HADOOP_HOME set, since that worked in 0.15.1.
Kouhei Sutou / @kou: I'll fix it. |
Wes McKinney / @wesm:
I have my env variables set up correctly according to the pyarrow README.
Use the following script to reproduce.
With pyarrow version 0.15.1 it is fine; however, version 0.16.0 gives an error.
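The reproduction script itself did not survive the migration. As a rough illustration of the regression being reported (the search paths below are assumptions inferred from the comments above, not Arrow's actual loader code): 0.15.1 could locate libhdfs.so from HADOOP_HOME alone, while 0.16.0 reportedly looks directly under $HADOOP_HOME, where a standard Hadoop layout has no such file.

```python
import os.path


def candidates_0_15(environ):
    # 0.15.1 behavior as reported: HADOOP_HOME alone was enough, which
    # implies the conventional lib/native subdirectory was searched.
    home = environ.get("HADOOP_HOME")
    return [os.path.join(home, "lib", "native", "libhdfs.so")] if home else []


def candidates_0_16(environ):
    # 0.16.0 behavior as described in the comments: libhdfs.so is looked up
    # directly under HADOOP_HOME, so a standard layout fails to load it.
    home = environ.get("HADOOP_HOME")
    return [os.path.join(home, "libhdfs.so")] if home else []


env = {"HADOOP_HOME": "/opt/hadoop/latest"}
print(candidates_0_15(env))  # ['/opt/hadoop/latest/lib/native/libhdfs.so']
print(candidates_0_16(env))  # ['/opt/hadoop/latest/libhdfs.so']
```

The second path does not exist in a stock Hadoop install, which matches the "libhdfs.so not found" error reported against 0.16.0.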
Reporter: Jack Fan
Assignee: Kouhei Sutou / @kou
Note: This issue was originally created as ARROW-7841. Please see the migration documentation for further details.