
build difficulties #35

Open
kharris opened this Issue · 10 comments

9 participants

@kharris

Hi, I'm having issues building the jar with ant. I am not much of a Java developer, but we are building a hadoop system to work with our cluster's Torque batch scheduler and need to compile our own version. So far, elephant-bird and hadoop are working, but I can't get hadoop-lzo to build. I've tried passing the hadoop library directories to ant with

ant -noclasspath -lib $HOME/src/hadoop-0.20.2 -lib $HOME/hadoop-0.20.2/lib clean compile-native test tar

but it fails during compile-native with "[javah] Error: Class org.apache.hadoop.conf.Configuration could not be found." It looks like the hadoop libraries aren't being found. Full command output is below.

Am I missing some kind of environment variable that should be set? How do I specify that the hadoop classes are included during the build process?

Any help would be appreciated,
Kameron Harris
University of Vermont / onehappybird.com


Buildfile: /gpfs1/home/k/h/kharris/src/hadoop-lzo/build.xml

clean:
[delete] Deleting directory /gpfs1/home/k/h/kharris/src/hadoop-lzo/build

ivy-download:
[get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
[get] To: /gpfs1/home/k/h/kharris/src/hadoop-lzo/ivy/ivy-2.0.0-rc2.jar
[get] Not modified - so not downloaded

ivy-init-dirs:
[mkdir] Created dir: /gpfs1/home/k/h/kharris/src/hadoop-lzo/build/ivy
[mkdir] Created dir: /gpfs1/home/k/h/kharris/src/hadoop-lzo/build/ivy/lib
[mkdir] Created dir: /gpfs1/home/k/h/kharris/src/hadoop-lzo/build/ivy/report
[mkdir] Created dir: /gpfs1/home/k/h/kharris/src/hadoop-lzo/build/ivy/maven

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: Ivy 2.0.0-rc2 - 20081028224207 :: http://ant.apache.org/ivy/ ::
:: loading settings :: file = /gpfs1/home/k/h/kharris/src/hadoop-lzo/ivy/ivysettings.xml

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: com.hadoop.gplcompression#Hadoop-GPL-Compression;working@bluemoon-user2.cluster
[ivy:resolve] confs: [common]
[ivy:resolve] found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] found junit#junit;3.8.1 in maven2
[ivy:resolve] found commons-logging#commons-logging-api;1.0.4 in maven2
[ivy:resolve] :: resolution report :: resolve 209ms :: artifacts dl 6ms
---------------------------------------------------------------------
|                  |            modules            ||   artifacts   |
|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
|      common      |   3   |   0   |   0   |   0   ||   3   |   0   |
---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: com.hadoop.gplcompression#Hadoop-GPL-Compression
[ivy:retrieve] confs: [common]
[ivy:retrieve] 3 artifacts copied, 0 already retrieved (180kB/12ms)
No ivy:settings found for the default reference 'ivy.instance'. A default instance will be used
DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
:: loading settings :: file = /gpfs1/home/k/h/kharris/src/hadoop-lzo/ivy/ivysettings.xml

init:
[mkdir] Created dir: /gpfs1/home/k/h/kharris/src/hadoop-lzo/build/classes
[mkdir] Created dir: /gpfs1/home/k/h/kharris/src/hadoop-lzo/build/src
[mkdir] Created dir: /gpfs1/home/k/h/kharris/src/hadoop-lzo/build/test
[mkdir] Created dir: /gpfs1/home/k/h/kharris/src/hadoop-lzo/build/test/classes

compile-java:
[javac] /gpfs1/home/k/h/kharris/src/hadoop-lzo/build.xml:216: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds
[javac] Compiling 24 source files to /gpfs1/home/k/h/kharris/src/hadoop-lzo/build/classes
[javac] warning: [options] bootstrap class path not set in conjunction with -source 1.6
[javac] /gpfs1/home/k/h/kharris/src/hadoop-lzo/src/java/com/hadoop/mapred/DeprecatedLzoLineRecordReader.java:31: warning: [deprecation] FileSplit in org.apache.hadoop.mapred has been deprecated
[javac] import org.apache.hadoop.mapred.FileSplit;
[javac] ^
[javac] /gpfs1/home/k/h/kharris/src/hadoop-lzo/src/java/com/hadoop/mapred/DeprecatedLzoTextInputFormat.java:34: warning: [deprecation] FileSplit in org.apache.hadoop.mapred has been deprecated
[javac] import org.apache.hadoop.mapred.FileSplit;
[javac] ^
[javac] /gpfs1/home/k/h/kharris/src/hadoop-lzo/src/java/com/hadoop/mapred/DeprecatedLzoTextInputFormat.java:35: warning: [deprecation] InputSplit in org.apache.hadoop.mapred has been deprecated
[javac] import org.apache.hadoop.mapred.InputSplit;
[javac] ^
[javac] /gpfs1/home/k/h/kharris/src/hadoop-lzo/src/java/com/hadoop/mapred/DeprecatedLzoTextInputFormat.java:36: warning: [deprecation] JobConf in org.apache.hadoop.mapred has been deprecated
[javac] import org.apache.hadoop.mapred.JobConf;
[javac] ^
[javac] /gpfs1/home/k/h/kharris/src/hadoop-lzo/src/java/com/hadoop/mapred/DeprecatedLzoTextInputFormat.java:37: warning: [deprecation] JobConfigurable in org.apache.hadoop.mapred has been deprecated
[javac] import org.apache.hadoop.mapred.JobConfigurable;
[javac] ^
[javac] /gpfs1/home/k/h/kharris/src/hadoop-lzo/src/java/com/hadoop/mapred/DeprecatedLzoTextInputFormat.java:40: warning: [deprecation] TextInputFormat in org.apache.hadoop.mapred has been deprecated
[javac] import org.apache.hadoop.mapred.TextInputFormat;
[javac] ^
[javac] /gpfs1/home/k/h/kharris/src/hadoop-lzo/src/java/com/hadoop/mapred/DeprecatedLzoTextInputFormat.java:67: warning: [deprecation] TextInputFormat in org.apache.hadoop.mapred has been deprecated
[javac] public class DeprecatedLzoTextInputFormat extends TextInputFormat {
[javac] ^
[javac] 8 warnings

check-native-uptodate:

compile-native:
[mkdir] Created dir: /gpfs1/home/k/h/kharris/src/hadoop-lzo/build/native/Linux-amd64-64/lib
[mkdir] Created dir: /gpfs1/home/k/h/kharris/src/hadoop-lzo/build/native/Linux-amd64-64/src/com/hadoop/compression/lzo
[javah] Error: Class org.apache.hadoop.conf.Configuration could not be found.

BUILD FAILED
/gpfs1/home/k/h/kharris/src/hadoop-lzo/build.xml:242: compilation failed

Total time: 5 seconds

@kharris

ant 1.8.2
current hadoop-lzo

@emtnezv

Hi,
I'm having the same issue — is there somebody who can help me?
I don't know whether I'm passing some parameter incorrectly during the build process.

Thanks for any help,
Enrique Martinez

@kharris

Okay, I figured it out a few days ago.
I had to edit build.xml to add the compiled build classes to the javah classpath in the compile-native target. It now reads:

<javah classpath="${build.classes}"
       destdir="${build.native}/src/com/hadoop/compression/lzo"
       force="yes"
       verbose="yes">
  <class name="com.hadoop.compression.lzo.LzoCompressor" />
  <class name="com.hadoop.compression.lzo.LzoDecompressor" />
  <classpath refid="classpath"/>
</javah>
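For context on why this fix works: javah generates JNI C headers from *compiled* classes, so the class being processed and every type referenced by it (such as Hadoop's Configuration, which triggered the error above) must be on javah's classpath. The added classpath attribute points at the project's own build output, while the nested classpath refid supplies the Hadoop jars. A minimal, self-contained sketch of the kind of class javah consumes — the class and method names here are hypothetical, not from hadoop-lzo:

```java
// Hypothetical example class; names are illustrative only.
public class NativeDemo {
    // For this declaration, javah would emit a C header declaring:
    // JNIEXPORT jlong JNICALL Java_NativeDemo_compress(JNIEnv *, jobject, jbyteArray);
    public native long compress(byte[] src);

    public static void main(String[] args) {
        // The native method is never called here (no JNI library is loaded);
        // this only shows that the class compiles and loads on its own,
        // which is all javah needs to generate the header.
        System.out.println(NativeDemo.class.getName());
    }
}
```

If NativeDemo.class is not on javah's classpath — or any class referenced in its signatures is missing — javah fails with exactly the "Class ... could not be found" error seen in the log above.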
@kharris kharris closed this
@edsu

Is there any reason why this fix hasn't been committed back? I had the same problem with Java 1.7 and Ant 1.8.2.

@gourav-sg

Thanks a ton, this worked like a charm

@ryandm

Yes, is there any reason this cannot be committed? I ran into the same problem with Java 1.7.0 and Ant 1.8.4.

@dvryaboy dvryaboy reopened this
@dvryaboy
Collaborator

Send a pull request :)

@dvryaboy
Collaborator

(Friendlier explanation: the original issue was closed by the author, so it didn't show up as outstanding and we didn't see it. The best way to get code into a project on GitHub is to send a pull request.)

@rangadi rangadi was assigned
@skiold skiold referenced this issue from a commit in skiold/hadoop-lzo
@skiold skiold fix compile-native as per issue #35 44828ed
@abrock abrock referenced this issue
Merged

Update build.xml #59

@varunshaji

Still not updated in the main branch :(

@spullara

The pull request works for me, but the patch still has to be applied manually whenever you want to build the project.

@killerwhile killerwhile referenced this issue from a commit
@abrock abrock Update build.xml
See #35

I got the exact same error message as kharris and fixed it exactly the way he fixed it.
38823ca