Issue installing pydoop in Amazon emr-4.7.0 #214

Closed

ealtuna commented Jun 7, 2016

I have tried several approaches, but I cannot install pydoop on Amazon emr-4.7.0. This is the command output:

```
pip install pydoop
You are using pip version 6.1.1, however version 8.1.2 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
Collecting pydoop
Downloading pydoop-1.2.0.tar.gz (956kB)
100% |████████████████████████████████| 958kB 442kB/s
Requirement already satisfied (use --upgrade to upgrade): setuptools>=3.3 in /usr/lib/python2.7/dist-packages (from pydoop)
Installing collected packages: pydoop
Running setup.py install for pydoop
Complete output from command /usr/bin/python2.7 -c "import setuptools, tokenize;__file__='/mnt/tmp/pip-build-X4tBci/pydoop/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record /tmp/pip-DwjTNG-record/install-record.txt --single-version-externally-managed --compile:
using setuptools version 12.2
running install
running build
hdfs core implementation: native
running build_py
creating build
creating build/lib
creating build/lib/pydoop
copying pydoop/test_support.py -> build/lib/pydoop
copying pydoop/__init__.py -> build/lib/pydoop
copying pydoop/pipes.py -> build/lib/pydoop
copying pydoop/hadoop_utils.py -> build/lib/pydoop
copying pydoop/jc.py -> build/lib/pydoop
copying pydoop/avrolib.py -> build/lib/pydoop
copying pydoop/test_utils.py -> build/lib/pydoop
copying pydoop/hadut.py -> build/lib/pydoop
copying pydoop/config.py -> build/lib/pydoop
copying pydoop/version.py -> build/lib/pydoop
creating build/lib/pydoop/mapreduce
copying pydoop/mapreduce/__init__.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/api.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/connections.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/pipes.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/streams.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/simulator.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/string_utils.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/binary_streams.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/jwritable_utils.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/text_streams.py -> build/lib/pydoop/mapreduce
creating build/lib/pydoop/utils
copying pydoop/utils/__init__.py -> build/lib/pydoop/utils
copying pydoop/utils/jvm.py -> build/lib/pydoop/utils
copying pydoop/utils/serialize.py -> build/lib/pydoop/utils
copying pydoop/utils/conversion_tables.py -> build/lib/pydoop/utils
copying pydoop/utils/misc.py -> build/lib/pydoop/utils
creating build/lib/pydoop/hdfs
copying pydoop/hdfs/path.py -> build/lib/pydoop/hdfs
copying pydoop/hdfs/__init__.py -> build/lib/pydoop/hdfs
copying pydoop/hdfs/fs.py -> build/lib/pydoop/hdfs
copying pydoop/hdfs/common.py -> build/lib/pydoop/hdfs
copying pydoop/hdfs/file.py -> build/lib/pydoop/hdfs
creating build/lib/pydoop/app
copying pydoop/app/__init__.py -> build/lib/pydoop/app
copying pydoop/app/script_template.py -> build/lib/pydoop/app
copying pydoop/app/submit.py -> build/lib/pydoop/app
copying pydoop/app/script.py -> build/lib/pydoop/app
copying pydoop/app/main.py -> build/lib/pydoop/app
copying pydoop/app/argparse_types.py -> build/lib/pydoop/app
creating build/lib/pydoop/utils/bridge
copying pydoop/utils/bridge/pyjnius_loader.py -> build/lib/pydoop/utils/bridge
copying pydoop/utils/bridge/__init__.py -> build/lib/pydoop/utils/bridge
copying pydoop/utils/bridge/factory.py -> build/lib/pydoop/utils/bridge
copying pydoop/utils/bridge/jpype_loader.py -> build/lib/pydoop/utils/bridge
creating build/lib/pydoop/hdfs/core
copying pydoop/hdfs/core/__init__.py -> build/lib/pydoop/hdfs/core
copying pydoop/hdfs/core/api.py -> build/lib/pydoop/hdfs/core
copying pydoop/hdfs/core/impl.py -> build/lib/pydoop/hdfs/core
creating build/lib/pydoop/hdfs/core/bridged
copying pydoop/hdfs/core/bridged/hadoop.py -> build/lib/pydoop/hdfs/core/bridged
copying pydoop/hdfs/core/bridged/__init__.py -> build/lib/pydoop/hdfs/core/bridged
copying pydoop/hdfs/core/bridged/common.py -> build/lib/pydoop/hdfs/core/bridged
copying pydoop/pydoop.properties -> build/lib/pydoop
running build_ext
building 'pydoop.sercore' extension
creating build/temp.linux-x86_64-2.7
creating build/temp.linux-x86_64-2.7/src
creating build/temp.linux-x86_64-2.7/src/serialize
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -I/usr/include/python2.7 -c src/serialize/protocol_codec.cc -o build/temp.linux-x86_64-2.7/src/serialize/protocol_codec.o -Wno-write-strings -O3
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -I/usr/include/python2.7 -c src/serialize/SerialUtils.cc -o build/temp.linux-x86_64-2.7/src/serialize/SerialUtils.o -Wno-write-strings -O3
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -I/usr/include/python2.7 -c src/serialize/StringUtils.cc -o build/temp.linux-x86_64-2.7/src/serialize/StringUtils.o -Wno-write-strings -O3
g++ -pthread -shared build/temp.linux-x86_64-2.7/src/serialize/protocol_codec.o build/temp.linux-x86_64-2.7/src/serialize/SerialUtils.o build/temp.linux-x86_64-2.7/src/serialize/StringUtils.o -L/usr/lib64 -lpython2.7 -o build/lib/pydoop/sercore.so
building 'pydoop.native_core_hdfs' extension
creating build/temp.linux-x86_64-2.7/src/libhdfsV2
creating build/temp.linux-x86_64-2.7/src/libhdfsV2/common
creating build/temp.linux-x86_64-2.7/src/libhdfsV2/os
creating build/temp.linux-x86_64-2.7/src/libhdfsV2/os/posix
creating build/temp.linux-x86_64-2.7/src/native_core_hdfs
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -DHADOOP_LIBHDFS_V2=1 -I/etc/alternatives/jre/include -Inative/jni_include -I/etc/alternatives/jre/lib -I/etc/alternatives/jre/include/linux -Isrc/libhdfsV2 -Isrc/libhdfsV2/os/posix -I/usr/include/python2.7 -c src/libhdfsV2/exception.c -o build/temp.linux-x86_64-2.7/src/libhdfsV2/exception.o -Wno-write-strings
In file included from src/libhdfsV2/exception.c:19:0:
src/libhdfsV2/exception.h:39:17: fatal error: jni.h: No such file or directory
#include <jni.h>
^
compilation terminated.
error: command 'gcc' failed with exit status 1

----------------------------------------
Command "/usr/bin/python2.7 -c "import setuptools, tokenize;__file__='/mnt/tmp/pip-build-X4tBci/pydoop/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record /tmp/pip-DwjTNG-record/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /mnt/tmp/pip-build-X4tBci/pydoop
ilveroluca (Member) commented:

Hi. It's not finding the JDK. Make sure it's installed. If it is, make sure Pydoop can find it by setting the JAVA_HOME environment variable.

HTH

Luca
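
A quick way to check whether `JAVA_HOME` points at a full JDK rather than a JRE (an illustrative sketch, not part of Pydoop; the compile above fails because `jni.h`, which ships only with the JDK, cannot be found):

```python
import os

# The native extension build needs the JNI headers from the JDK.
java_home = os.environ.get("JAVA_HOME", "")
jni_header = os.path.join(java_home, "include", "jni.h")
print("JAVA_HOME =", java_home or "<not set>")
print("jni.h present:", os.path.exists(jni_header))
```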

ealtuna commented Jun 7, 2016

JAVA_HOME is set correctly. As a workaround, I tried installing JDK 8 with:

```
wget --no-cookies --no-check-certificate --header "Cookie: gpw_e24=http%3A%2F%2Fwww.oracle.com%2F; oraclelicense=accept-securebackup-cookie" "http://download.oracle.com/otn-pub/java/jdk/8-b132/jdk-8-linux-x64.rpm"

sudo yum -y install jdk-8-linux-x64.rpm

NR_OF_OPTIONS=$(echo 0 | alternatives --config java 2>/dev/null | grep 'There ' | awk '{print $3}' | tail -1)

sudo alternatives --install /usr/bin/java java /usr/java/default/bin/java 1

echo $(($NR_OF_OPTIONS + 1)) | sudo alternatives --config java

export JAVA_HOME=/usr/java/default/bin/java

export JRE_HOME=/usr/java/default/jre

export PATH=$PATH:/usr/java/default/bin
```

and after that I got:

```
sudo -E pip install pydoop
You are using pip version 6.1.1, however version 8.1.2 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
Collecting pydoop
Using cached pydoop-1.2.0.tar.gz
Requirement already satisfied (use --upgrade to upgrade): setuptools>=3.3 in /usr/lib/python2.7/dist-packages (from pydoop)
Installing collected packages: pydoop
Running setup.py install for pydoop
Complete output from command /usr/bin/python2.7 -c "import setuptools, tokenize;__file__='/mnt/tmp/pip-build-lr4JCM/pydoop/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record /tmp/pip-zfdtph-record/install-record.txt --single-version-externally-managed --compile:
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0
using setuptools version 12.2
running install
running build
hdfs core implementation: native
running build_py
[... file copying and pydoop.sercore compilation output identical to the first attempt, omitted ...]
building 'pydoop.native_core_hdfs' extension
creating build/temp.linux-x86_64-2.7/src/libhdfsV2
creating build/temp.linux-x86_64-2.7/src/libhdfsV2/common
creating build/temp.linux-x86_64-2.7/src/libhdfsV2/os
creating build/temp.linux-x86_64-2.7/src/libhdfsV2/os/posix
creating build/temp.linux-x86_64-2.7/src/native_core_hdfs
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -DHADOOP_LIBHDFS_V2=1 -I/usr/java/default/include -Inative/jni_include -I/usr/java/default/lib -I/usr/java/default/include/linux -Isrc/libhdfsV2 -Isrc/libhdfsV2/os/posix -I/usr/include/python2.7 -c src/libhdfsV2/exception.c -o build/temp.linux-x86_64-2.7/src/libhdfsV2/exception.o -Wno-write-strings
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -DHADOOP_LIBHDFS_V2=1 -I/usr/java/default/include -Inative/jni_include -I/usr/java/default/lib -I/usr/java/default/include/linux -Isrc/libhdfsV2 -Isrc/libhdfsV2/os/posix -I/usr/include/python2.7 -c src/libhdfsV2/expect.c -o build/temp.linux-x86_64-2.7/src/libhdfsV2/expect.o -Wno-write-strings
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -DHADOOP_LIBHDFS_V2=1 -I/usr/java/default/include -Inative/jni_include -I/usr/java/default/lib -I/usr/java/default/include/linux -Isrc/libhdfsV2 -Isrc/libhdfsV2/os/posix -I/usr/include/python2.7 -c src/libhdfsV2/native_mini_dfs.c -o build/temp.linux-x86_64-2.7/src/libhdfsV2/native_mini_dfs.o -Wno-write-strings
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -DHADOOP_LIBHDFS_V2=1 -I/usr/java/default/include -Inative/jni_include -I/usr/java/default/lib -I/usr/java/default/include/linux -Isrc/libhdfsV2 -Isrc/libhdfsV2/os/posix -I/usr/include/python2.7 -c src/libhdfsV2/hdfs.c -o build/temp.linux-x86_64-2.7/src/libhdfsV2/hdfs.o -Wno-write-strings
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -DHADOOP_LIBHDFS_V2=1 -I/usr/java/default/include -Inative/jni_include -I/usr/java/default/lib -I/usr/java/default/include/linux -Isrc/libhdfsV2 -Isrc/libhdfsV2/os/posix -I/usr/include/python2.7 -c src/libhdfsV2/jni_helper.c -o build/temp.linux-x86_64-2.7/src/libhdfsV2/jni_helper.o -Wno-write-strings
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -DHADOOP_LIBHDFS_V2=1 -I/usr/java/default/include -Inative/jni_include -I/usr/java/default/lib -I/usr/java/default/include/linux -Isrc/libhdfsV2 -Isrc/libhdfsV2/os/posix -I/usr/include/python2.7 -c src/libhdfsV2/common/htable.c -o build/temp.linux-x86_64-2.7/src/libhdfsV2/common/htable.o -Wno-write-strings
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -DHADOOP_LIBHDFS_V2=1 -I/usr/java/default/include -Inative/jni_include -I/usr/java/default/lib -I/usr/java/default/include/linux -Isrc/libhdfsV2 -Isrc/libhdfsV2/os/posix -I/usr/include/python2.7 -c src/libhdfsV2/os/posix/mutexes.c -o build/temp.linux-x86_64-2.7/src/libhdfsV2/os/posix/mutexes.o -Wno-write-strings
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -DHADOOP_LIBHDFS_V2=1 -I/usr/java/default/include -Inative/jni_include -I/usr/java/default/lib -I/usr/java/default/include/linux -Isrc/libhdfsV2 -Isrc/libhdfsV2/os/posix -I/usr/include/python2.7 -c src/libhdfsV2/os/posix/thread.c -o build/temp.linux-x86_64-2.7/src/libhdfsV2/os/posix/thread.o -Wno-write-strings
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -DHADOOP_LIBHDFS_V2=1 -I/usr/java/default/include -Inative/jni_include -I/usr/java/default/lib -I/usr/java/default/include/linux -Isrc/libhdfsV2 -Isrc/libhdfsV2/os/posix -I/usr/include/python2.7 -c src/libhdfsV2/os/posix/thread_local_storage.c -o build/temp.linux-x86_64-2.7/src/libhdfsV2/os/posix/thread_local_storage.o -Wno-write-strings
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -DHADOOP_LIBHDFS_V2=1 -I/usr/java/default/include -Inative/jni_include -I/usr/java/default/lib -I/usr/java/default/include/linux -Isrc/libhdfsV2 -Isrc/libhdfsV2/os/posix -I/usr/include/python2.7 -c src/native_core_hdfs/hdfs_fs.cc -o build/temp.linux-x86_64-2.7/src/native_core_hdfs/hdfs_fs.o -Wno-write-strings
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -DHADOOP_LIBHDFS_V2=1 -I/usr/java/default/include -Inative/jni_include -I/usr/java/default/lib -I/usr/java/default/include/linux -Isrc/libhdfsV2 -Isrc/libhdfsV2/os/posix -I/usr/include/python2.7 -c src/native_core_hdfs/hdfs_file.cc -o build/temp.linux-x86_64-2.7/src/native_core_hdfs/hdfs_file.o -Wno-write-strings
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -DHADOOP_LIBHDFS_V2=1 -I/usr/java/default/include -Inative/jni_include -I/usr/java/default/lib -I/usr/java/default/include/linux -Isrc/libhdfsV2 -Isrc/libhdfsV2/os/posix -I/usr/include/python2.7 -c src/native_core_hdfs/hdfs_module.cc -o build/temp.linux-x86_64-2.7/src/native_core_hdfs/hdfs_module.o -Wno-write-strings
g++ -pthread -shared build/temp.linux-x86_64-2.7/src/libhdfsV2/exception.o build/temp.linux-x86_64-2.7/src/libhdfsV2/expect.o build/temp.linux-x86_64-2.7/src/libhdfsV2/native_mini_dfs.o build/temp.linux-x86_64-2.7/src/libhdfsV2/hdfs.o build/temp.linux-x86_64-2.7/src/libhdfsV2/jni_helper.o build/temp.linux-x86_64-2.7/src/libhdfsV2/common/htable.o build/temp.linux-x86_64-2.7/src/libhdfsV2/os/posix/mutexes.o build/temp.linux-x86_64-2.7/src/libhdfsV2/os/posix/thread.o build/temp.linux-x86_64-2.7/src/libhdfsV2/os/posix/thread_local_storage.o build/temp.linux-x86_64-2.7/src/native_core_hdfs/hdfs_fs.o build/temp.linux-x86_64-2.7/src/native_core_hdfs/hdfs_file.o build/temp.linux-x86_64-2.7/src/native_core_hdfs/hdfs_module.o -L/usr/java/default//Libraries -L/usr/java/default/jre/lib/amd64/server -L/usr/lib64 -ldl -ljvm -lpython2.7 -o build/lib/pydoop/native_core_hdfs.so -Wl,-rpath,/usr/java/default/jre/lib/amd64/server
hadoop_home: '/usr/lib/hadoop'
hadoop_version: '2.7.1-amzn-0'
java_home: '/usr/java/default/'
Building java code for hadoop-2.7.1-amzn-0
Compiling Java classes
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:22: error: package org.apache.hadoop.conf does not exist
import org.apache.hadoop.conf.Configuration;
^
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:23: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.FloatWritable;
^
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:24: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.NullWritable;
^
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:25: error: package org.apache.hadoop.mapred does not exist
import org.apache.hadoop.mapred.InputFormat;
^
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:26: error: package org.apache.hadoop.mapred does not exist
import org.apache.hadoop.mapred.InputSplit;
^
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:27: error: package org.apache.hadoop.mapred does not exist
import org.apache.hadoop.mapred.JobConf;
^
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:28: error: package org.apache.hadoop.mapred does not exist
import org.apache.hadoop.mapred.RecordReader;
^
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:29: error: package org.apache.hadoop.mapred does not exist
import org.apache.hadoop.mapred.Reporter;
^
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:30: error: package org.apache.hadoop.mapred does not exist
import org.apache.hadoop.mapred.TextInputFormat;
^
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:31: error: package org.apache.hadoop.util does not exist
import org.apache.hadoop.util.ReflectionUtils;
^
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:43: error: cannot find symbol
implements InputFormat<FloatWritable, NullWritable> {
^
symbol: class InputFormat
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:43: error: cannot find symbol
implements InputFormat<FloatWritable, NullWritable> {
^
symbol: class FloatWritable
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:43: error: cannot find symbol
implements InputFormat<FloatWritable, NullWritable> {
^
symbol: class NullWritable
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:46: error: cannot find symbol
InputSplit genericSplit, JobConf job, Reporter reporter)
^
symbol: class InputSplit
location: class PipesNonJavaInputFormat
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:46: error: cannot find symbol
InputSplit genericSplit, JobConf job, Reporter reporter)
^
symbol: class JobConf
location: class PipesNonJavaInputFormat
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:46: error: cannot find symbol
InputSplit genericSplit, JobConf job, Reporter reporter)
^
symbol: class Reporter
location: class PipesNonJavaInputFormat
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:45: error: cannot find symbol
public RecordReader<FloatWritable, NullWritable> getRecordReader(
^
symbol: class RecordReader
location: class PipesNonJavaInputFormat
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:45: error: cannot find symbol
public RecordReader<FloatWritable, NullWritable> getRecordReader(
^
symbol: class FloatWritable
location: class PipesNonJavaInputFormat
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:45: error: cannot find symbol
public RecordReader<FloatWritable, NullWritable> getRecordReader(
^
symbol: class NullWritable
location: class PipesNonJavaInputFormat
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:51: error: cannot find symbol
public InputSplit[] getSplits(JobConf job, int numSplits) throws IOException {
^
symbol: class JobConf
location: class PipesNonJavaInputFormat
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:51: error: cannot find symbol
public InputSplit[] getSplits(JobConf job, int numSplits) throws IOException {
^
symbol: class InputSplit
location: class PipesNonJavaInputFormat
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:69: error: cannot find symbol
static class PipesDummyRecordReader implements RecordReader<FloatWritable, NullWritable> {
^
symbol: class RecordReader
location: class PipesNonJavaInputFormat
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:69: error: cannot find symbol
static class PipesDummyRecordReader implements RecordReader<FloatWritable, NullWritable> {
^
symbol: class FloatWritable
location: class PipesNonJavaInputFormat
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:69: error: cannot find symbol
static class PipesDummyRecordReader implements RecordReader<FloatWritable, NullWritable> {
^
symbol: class NullWritable
location: class PipesNonJavaInputFormat
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:72: error: cannot find symbol
public PipesDummyRecordReader(Configuration job, InputSplit split)
^
symbol: class Configuration
location: class PipesDummyRecordReader
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:72: error: cannot find symbol
public PipesDummyRecordReader(Configuration job, InputSplit split)
^
symbol: class InputSplit
location: class PipesDummyRecordReader
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:77: error: cannot find symbol
public FloatWritable createKey() {
^
symbol: class FloatWritable
location: class PipesDummyRecordReader
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:81: error: cannot find symbol
public NullWritable createValue() {
^
symbol: class NullWritable
location: class PipesDummyRecordReader
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:95: error: cannot find symbol
public synchronized boolean next(FloatWritable key, NullWritable value)
^
symbol: class FloatWritable
location: class PipesDummyRecordReader
src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java:95: error: cannot find symbol
public synchronized boolean next(FloatWritable key, NullWritable value)
^
symbol: class NullWritable
location: class PipesDummyRecordReader
src/v2/it/crs4/pydoop/pipes/PipesMapRunner.java:23: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.FloatWritable;
^
src/v2/it/crs4/pydoop/pipes/PipesMapRunner.java:24: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.NullWritable;
^
src/v2/it/crs4/pydoop/pipes/PipesMapRunner.java:25: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.Writable;
^
src/v2/it/crs4/pydoop/pipes/PipesMapRunner.java:26: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.WritableComparable;
^
src/v2/it/crs4/pydoop/pipes/PipesMapRunner.java:27: error: package org.apache.hadoop.mapred does not exist
import org.apache.hadoop.mapred.JobConf;
^
src/v2/it/crs4/pydoop/pipes/PipesMapRunner.java:28: error: package org.apache.hadoop.mapred does not exist
import org.apache.hadoop.mapred.MapRunner;
^
src/v2/it/crs4/pydoop/pipes/PipesMapRunner.java:29: error: package org.apache.hadoop.mapred does not exist
import org.apache.hadoop.mapred.OutputCollector;
^
src/v2/it/crs4/pydoop/pipes/PipesMapRunner.java:30: error: package org.apache.hadoop.mapred does not exist
import org.apache.hadoop.mapred.RecordReader;
^
src/v2/it/crs4/pydoop/pipes/PipesMapRunner.java:31: error: package org.apache.hadoop.mapred does not exist
import org.apache.hadoop.mapred.Reporter;
^
src/v2/it/crs4/pydoop/pipes/PipesMapRunner.java:32: error: package org.apache.hadoop.mapred does not exist
import org.apache.hadoop.mapred.SkipBadRecords;
^
src/v2/it/crs4/pydoop/pipes/PipesMapRunner.java:33: error: package org.apache.hadoop.mapreduce does not exist
import org.apache.hadoop.mapreduce.MRJobConfig;
^
src/v2/it/crs4/pydoop/pipes/PipesMapRunner.java:40: error: cannot find symbol
extends MapRunner<K1, V1, K2, V2> {
^
symbol: class MapRunner
src/v2/it/crs4/pydoop/pipes/PipesMapRunner.java:38: error: cannot find symbol
class PipesMapRunner<K1 extends WritableComparable, V1 extends Writable,
^
symbol: class WritableComparable
src/v2/it/crs4/pydoop/pipes/PipesMapRunner.java:38: error: cannot find symbol
class PipesMapRunner<K1 extends WritableComparable, V1 extends Writable,
^
symbol: class Writable
src/v2/it/crs4/pydoop/pipes/PipesMapRunner.java:39: error: cannot find symbol
K2 extends WritableComparable, V2 extends Writable>
^
symbol: class WritableComparable
src/v2/it/crs4/pydoop/pipes/PipesMapRunner.java:39: error: cannot find symbol
K2 extends WritableComparable, V2 extends Writable>
^
symbol: class Writable
src/v2/it/crs4/pydoop/pipes/PipesMapRunner.java:41: error: cannot find symbol
private JobConf job;
^
symbol: class JobConf
location: class PipesMapRunner<K1,V1,K2,V2>
where K1,V1,K2,V2 are type-variables:
K1 declared in class PipesMapRunner
V1 declared in class PipesMapRunner
K2 declared in class PipesMapRunner
V2 declared in class PipesMapRunner
src/v2/it/crs4/pydoop/pipes/PipesMapRunner.java:47: error: cannot find symbol
public void configure(JobConf job) {
^
symbol: class JobConf
location: class PipesMapRunner<K1,V1,K2,V2>
where K1,V1,K2,V2 are type-variables:
K1 declared in class PipesMapRunner
V1 declared in class PipesMapRunner
K2 declared in class PipesMapRunner
V2 declared in class PipesMapRunner
src/v2/it/crs4/pydoop/pipes/PipesMapRunner.java:61: error: cannot find symbol
public void run(RecordReader<K1, V1> input, OutputCollector<K2, V2> output,
^
symbol: class RecordReader
location: class PipesMapRunner<K1,V1,K2,V2>
where K1,V1,K2,V2 are type-variables:
K1 declared in class PipesMapRunner
V1 declared in class PipesMapRunner
K2 declared in class PipesMapRunner
V2 declared in class PipesMapRunner
src/v2/it/crs4/pydoop/pipes/PipesMapRunner.java:61: error: cannot find symbol
public void run(RecordReader<K1, V1> input, OutputCollector<K2, V2> output,
^
symbol: class OutputCollector
location: class PipesMapRunner<K1,V1,K2,V2>
where K1,V1,K2,V2 are type-variables:
K1 declared in class PipesMapRunner
V1 declared in class PipesMapRunner
K2 declared in class PipesMapRunner
V2 declared in class PipesMapRunner
src/v2/it/crs4/pydoop/pipes/PipesMapRunner.java:62: error: cannot find symbol
Reporter reporter) throws IOException {
^
symbol: class Reporter
location: class PipesMapRunner<K1,V1,K2,V2>
where K1,V1,K2,V2 are type-variables:
K1 declared in class PipesMapRunner
V1 declared in class PipesMapRunner
K2 declared in class PipesMapRunner
V2 declared in class PipesMapRunner
src/v2/it/crs4/pydoop/pipes/LocalJobRunner.java:20: error: package org.apache.hadoop.fs does not exist
import org.apache.hadoop.fs.Path;
^
src/v2/it/crs4/pydoop/pipes/Application.java:33: error: package org.apache.commons.logging does not exist
import org.apache.commons.logging.Log;
^
src/v2/it/crs4/pydoop/pipes/Application.java:34: error: package org.apache.commons.logging does not exist
import org.apache.commons.logging.LogFactory;
^
src/v2/it/crs4/pydoop/pipes/Application.java:35: error: package org.apache.hadoop.fs does not exist
import org.apache.hadoop.fs.FSDataOutputStream;
^
src/v2/it/crs4/pydoop/pipes/Application.java:36: error: package org.apache.hadoop.fs does not exist
import org.apache.hadoop.fs.FileSystem;
^
src/v2/it/crs4/pydoop/pipes/Application.java:37: error: package org.apache.hadoop.fs does not exist
import org.apache.hadoop.fs.FileUtil;
^
src/v2/it/crs4/pydoop/pipes/Application.java:38: error: package org.apache.hadoop.fs does not exist
import org.apache.hadoop.fs.Path;
^
src/v2/it/crs4/pydoop/pipes/Application.java:39: error: package org.apache.hadoop.fs.permission does not exist
import org.apache.hadoop.fs.permission.FsPermission;
^
src/v2/it/crs4/pydoop/pipes/Application.java:40: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.FloatWritable;
^
src/v2/it/crs4/pydoop/pipes/Application.java:41: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.NullWritable;
^
src/v2/it/crs4/pydoop/pipes/Application.java:42: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.Writable;
^
src/v2/it/crs4/pydoop/pipes/Application.java:43: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.WritableComparable;
^
src/v2/it/crs4/pydoop/pipes/Application.java:44: error: package org.apache.hadoop.mapred does not exist
import org.apache.hadoop.mapred.JobConf;
^
src/v2/it/crs4/pydoop/pipes/Application.java:45: error: package org.apache.hadoop.mapred does not exist
import org.apache.hadoop.mapred.OutputCollector;
^
src/v2/it/crs4/pydoop/pipes/Application.java:46: error: package org.apache.hadoop.mapred does not exist
import org.apache.hadoop.mapred.RecordReader;
^
src/v2/it/crs4/pydoop/pipes/Application.java:47: error: package org.apache.hadoop.mapred does not exist
import org.apache.hadoop.mapred.Reporter;
^
src/v2/it/crs4/pydoop/pipes/Application.java:48: error: package org.apache.hadoop.mapred does not exist
import org.apache.hadoop.mapred.TaskAttemptID;
^
src/v2/it/crs4/pydoop/pipes/Application.java:49: error: package org.apache.hadoop.mapred does not exist
import org.apache.hadoop.mapred.TaskLog;
^
src/v2/it/crs4/pydoop/pipes/Application.java:51: error: package org.apache.hadoop.mapreduce does not exist
import org.apache.hadoop.mapreduce.MRJobConfig;
^
src/v2/it/crs4/pydoop/pipes/Application.java:52: error: package org.apache.hadoop.mapreduce.filecache does not exist
import org.apache.hadoop.mapreduce.filecache.DistributedCache;
^
src/v2/it/crs4/pydoop/pipes/Application.java:53: error: package org.apache.hadoop.mapreduce.security does not exist
import org.apache.hadoop.mapreduce.security.SecureShuffleUtils;
^
src/v2/it/crs4/pydoop/pipes/Application.java:54: error: package org.apache.hadoop.mapreduce.security does not exist
import org.apache.hadoop.mapreduce.security.TokenCache;
^
src/v2/it/crs4/pydoop/pipes/Application.java:55: error: package org.apache.hadoop.mapreduce.security.token does not exist
import org.apache.hadoop.mapreduce.security.token.JobTokenIdentifier;
^
src/v2/it/crs4/pydoop/pipes/Application.java:56: error: package org.apache.hadoop.mapreduce.security.token does not exist
import org.apache.hadoop.mapreduce.security.token.JobTokenSecretManager;
^
src/v2/it/crs4/pydoop/pipes/Application.java:57: error: package org.apache.hadoop.security.token does not exist
import org.apache.hadoop.security.token.Token;
^
src/v2/it/crs4/pydoop/pipes/Application.java:58: error: package org.apache.hadoop.util does not exist
import org.apache.hadoop.util.ReflectionUtils;
^
src/v2/it/crs4/pydoop/pipes/Application.java:59: error: package org.apache.hadoop.util does not exist
import org.apache.hadoop.util.StringUtils;
^
src/v2/it/crs4/pydoop/pipes/Application.java:65: error: cannot find symbol
class Application<K1 extends WritableComparable, V1 extends Writable,
^
symbol: class WritableComparable
src/v2/it/crs4/pydoop/pipes/Application.java:65: error: cannot find symbol
class Application<K1 extends WritableComparable, V1 extends Writable,
^
symbol: class Writable
src/v2/it/crs4/pydoop/pipes/Application.java:66: error: cannot find symbol
K2 extends WritableComparable, V2 extends Writable> {
^
symbol: class WritableComparable
src/v2/it/crs4/pydoop/pipes/Application.java:66: error: cannot find symbol
K2 extends WritableComparable, V2 extends Writable> {
^
symbol: class Writable
src/v2/it/crs4/pydoop/pipes/Application.java:67: error: cannot find symbol
private static final Log LOG = LogFactory.getLog(Application.class.getName());
^
symbol: class Log
location: class Application<K1,V1,K2,V2>
where K1,V1,K2,V2 are type-variables:
K1 declared in class Application
V1 declared in class Application
K2 declared in class Application
V2 declared in class Application
src/v2/it/crs4/pydoop/pipes/OutputHandler.java:25: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.FloatWritable;
^
src/v2/it/crs4/pydoop/pipes/OutputHandler.java:26: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.NullWritable;
^
src/v2/it/crs4/pydoop/pipes/OutputHandler.java:27: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.Writable;
^
src/v2/it/crs4/pydoop/pipes/OutputHandler.java:28: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.WritableComparable;
^
src/v2/it/crs4/pydoop/pipes/OutputHandler.java:29: error: package org.apache.hadoop.mapred does not exist
import org.apache.hadoop.mapred.Counters;
^
src/v2/it/crs4/pydoop/pipes/OutputHandler.java:30: error: package org.apache.hadoop.mapred does not exist
import org.apache.hadoop.mapred.OutputCollector;
^
src/v2/it/crs4/pydoop/pipes/OutputHandler.java:31: error: package org.apache.hadoop.mapred does not exist
import org.apache.hadoop.mapred.RecordReader;
^
src/v2/it/crs4/pydoop/pipes/OutputHandler.java:32: error: package org.apache.hadoop.mapred does not exist
import org.apache.hadoop.mapred.Reporter;
^
src/v2/it/crs4/pydoop/pipes/UpwardProtocol.java:22: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.Writable;
^
src/v2/it/crs4/pydoop/pipes/UpwardProtocol.java:23: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.WritableComparable;
^
src/v2/it/crs4/pydoop/pipes/UpwardProtocol.java:29: error: cannot find symbol
interface UpwardProtocol<K extends WritableComparable, V extends Writable> {
^
symbol: class WritableComparable
src/v2/it/crs4/pydoop/pipes/UpwardProtocol.java:29: error: cannot find symbol
interface UpwardProtocol<K extends WritableComparable, V extends Writable> {
^
symbol: class Writable
src/v2/it/crs4/pydoop/pipes/OutputHandler.java:37: error: cannot find symbol
class OutputHandler<K extends WritableComparable,
^
symbol: class WritableComparable
src/v2/it/crs4/pydoop/pipes/OutputHandler.java:38: error: cannot find symbol
V extends Writable>
^
symbol: class Writable
src/v2/it/crs4/pydoop/pipes/DownwardProtocol.java:23: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.Writable;
^
src/v2/it/crs4/pydoop/pipes/DownwardProtocol.java:24: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.WritableComparable;
^
src/v2/it/crs4/pydoop/pipes/DownwardProtocol.java:25: error: package org.apache.hadoop.mapred does not exist
import org.apache.hadoop.mapred.InputSplit;
^
error: Error compiling java component. Command: javac -classpath /usr/lib/hadoop/lib/native:/usr/lib/hadoop/etc/hadoop:build/lib/pydoop/avro-mapred-1.7.4-hadoop2.jar -d 'build/temp.linux-x86_64-2.7/pipes' src/v2/it/crs4/pydoop/pipes/PipesNonJavaInputFormat.java src/v2/it/crs4/pydoop/pipes/PipesMapRunner.java src/v2/it/crs4/pydoop/pipes/LocalJobRunner.java src/v2/it/crs4/pydoop/pipes/Application.java src/v2/it/crs4/pydoop/pipes/DownwardProtocol.java src/v2/it/crs4/pydoop/pipes/PipesReducer.java src/v2/it/crs4/pydoop/pipes/UpwardProtocol.java src/v2/it/crs4/pydoop/pipes/Submitter.java src/v2/it/crs4/pydoop/pipes/BinaryProtocol.java src/v2/it/crs4/pydoop/pipes/OutputHandler.java src/v2/it/crs4/pydoop/pipes/PipesPartitioner.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroInputKeyValueBridge.java src/v2/it/crs4/pydoop/mapreduce/pipes/PipesNonJavaInputFormat.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroValueRecordReader.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyRecordWriter.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueRecordWriter.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroBridgeKeyWriter.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroValueInputFormat.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueRecordReader.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroOutputFormatBase.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueOutputFormat.java src/v2/it/crs4/pydoop/mapreduce/pipes/TaskLog.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueInputFormat.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroBridgeKeyReader.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroRecordWriterBase.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyOutputFormat.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroBridgeValueReader.java src/v2/it/crs4/pydoop/mapreduce/pipes/Application.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroInputBridgeBase.java src/v2/it/crs4/pydoop/mapreduce/pipes/DownwardProtocol.java src/v2/it/crs4/pydoop/mapreduce/pipes/DummyRecordReader.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroRecordReaderBase.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroBridgeWriterBase.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroBridgeKeyValueWriter.java src/v2/it/crs4/pydoop/mapreduce/pipes/PipesReducer.java src/v2/it/crs4/pydoop/mapreduce/pipes/UpwardProtocol.java src/v2/it/crs4/pydoop/mapreduce/pipes/Submitter.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroOutputBridgeBase.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroBridgeReaderBase.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroValueOutputFormat.java src/v2/it/crs4/pydoop/mapreduce/pipes/TaskLogAppender.java src/v2/it/crs4/pydoop/mapreduce/pipes/BinaryProtocol.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroValueRecordWriter.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroOutputKeyValueBridge.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroOutputValueBridge.java src/v2/it/crs4/pydoop/mapreduce/pipes/OutputHandler.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyRecordReader.java src/v2/it/crs4/pydoop/mapreduce/pipes/PipesPartitioner.java src/v2/it/crs4/pydoop/mapreduce/pipes/PipesMapper.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroBridgeKeyValueReader.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroInputKeyBridge.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyInputFormat.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroOutputKeyBridge.java 
src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroBridgeValueWriter.java src/v2/it/crs4/pydoop/mapreduce/pipes/PydoopAvroInputValueBridge.java src/v2/it/crs4/pydoop/NoSeparatorTextOutputFormat.java

----------------------------------------
Command "/usr/bin/python2.7 -c "import setuptools, tokenize;__file__='/mnt/tmp/pip-build-lr4JCM/pydoop/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record /tmp/pip-zfdtph-record/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /mnt/tmp/pip-build-lr4JCM/pydoop

```

jerrypaytm commented:

So no one has been able to solve this hadoop classpath issue?
Clearly the problem is that pydoop.hadoop_classpath() returns the wrong results.
Why not just run a subprocess to execute `hadoop classpath` and return the string?
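
A minimal sketch of that suggestion (hypothetical; this is not pydoop's actual implementation):

```python
import subprocess

def hadoop_classpath():
    # Delegate to the hadoop CLI, which knows the cluster's layout,
    # instead of reconstructing the classpath from guessed directories.
    return subprocess.check_output(["hadoop", "classpath"]).decode().strip()

print(hadoop_classpath())
```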

ilveroluca (Member) commented:

Hi there. Unfortunately we're out of AWS credits. Why don't you try your solution and submit a PR if it works? We'd be happy to receive your contribution.

Luca

jerrypaytm commented:

@ilveroluca The reason I'm trying to install pydoop on EMR is TensorflowOnSpark. Fortunately, it is no longer a requirement, so I don't need it anymore. :)

simleo added the EMR label Nov 22, 2017

simleo commented Mar 22, 2018

Checking now with emr-5.12.0. JAVA_HOME is set to `/etc/alternatives/jre`, which points to `/usr/lib/jvm/jre-1.8.0-openjdk.x86_64`, which in turn points to `/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.161-0.b14.36.amzn1.x86_64/jre`.

Pydoop needs to compile native code, so it needs the JDK home, i.e. `/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.161-0.b14.36.amzn1.x86_64`, which is pointed to by `/usr/lib/jvm/java-1.8.0-openjdk.x86_64`, which in turn is pointed to by `/etc/alternatives/java_sdk`. So the following makes compilation move forward:

```
export JAVA_HOME=/etc/alternatives/java_sdk
```
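
To confirm where the alternative actually lands (an illustrative check, assuming the image layout above):

```python
import os

# Resolve the full symlink chain and verify the JNI headers are there.
sdk = os.path.realpath("/etc/alternatives/java_sdk")
print(sdk)
print("has jni.h:", os.path.exists(os.path.join(sdk, "include", "jni.h")))
```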

This is not the end of the story, though, since the logic that detects Java dependencies is not EMR-aware yet, so the Java code won't compile.

I will update this ticket after fixing Java compilation.
