
Failed to load dag data #41

Open
evergreen3472 opened this Issue · 33 comments

6 participants

@evergreen3472

After the setup, I got this message when clicking the home button or loading the root page. Viewing the demo data (small and large) works fine.

@billonahill
Collaborator

When you start up your pig job, there should be a line saying that the Ambrose web server is starting on a given port. Do you see that entry, and is that the URL you're browsing to? If so, are there any other errors shown in the log?

@evergreen3472

Yes, it shows -Dambrose.port=8080 -Dambrose.post.script.sleep.seconds=600. There are no errors from running the pig script. The web server log shows "GET /dag HTTP/1.1" 404 -

@sagemintblue
Collaborator

Hi @evergreen3472. Could you please try URLs of the form "/web/", e.g. "http://localhost:8080/web/dag", and let us know if you get anything beyond a 404?

Also, it might be helpful to note what platform and browser you're using.

Thanks!

@billonahill
Collaborator
@evergreen3472

If I understand correctly, the web server is started by ambrose-demo, not by pig-ambrose? After firing up ambrose-demo, we can get the demo data from http://localhost:8080/index.html?localdata=small. Clicking "Home" then gives us this error.

@sagemintblue
Collaborator

The pig-ambrose command invokes pig with the ambrose plugin. The plugin starts a web server on the configured port. If the plugin is inactive, the server won't start. This is what @billonahill was getting at in his last post.
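For reference, a minimal sketch of the environment the pig-ambrose script sets up before invoking pig. The listener class and -D flags below are taken from the PIG_OPTS lines logged later in this thread; the port value is just the default shown there:

```shell
# Sketch only: these -D flags mirror the PIG_OPTS lines logged in this thread.
# If the listener class fails to load (version mismatch, classpath conflict),
# pig still runs, but the Ambrose web server never starts on this port.
export PIG_OPTS="-Dpig.notification.listener=com.twitter.ambrose.pig.EmbeddedAmbrosePigProgressNotificationListener \
-Dambrose.port=8080 \
-Dambrose.post.script.sleep.seconds=600"

# Show the port the plugin will try to bind.
echo "$PIG_OPTS" | grep -o 'ambrose\.port=[0-9]*'
```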

@evergreen3472

So, in my case, the web server was not started. I am using pig 0.8.1-cdh3u1. Will this be the issue?

@sagemintblue
Collaborator

https://github.com/twitter/ambrose/blob/master/pig/README.md

"""
The Ambrose Pig integration requires a number of patches that are committed on the Pig trunk and scheduled for release in Pig 0.11.0. Hence, the Ambrose distribution references a Pig 0.11.0-SNAPSHOT build. Note that running the ./bin/pig-ambrose script will result in the script being executed with the Pig 0.11.0-SNAPSHOT runtime.
"""

Please confirm that when you start pig-ambrose, you're running the bundled pig 0.11.0-SNAPSHOT.

@evergreen3472

Actually, pig-ambrose is invoking the pig 0.8.1 that is found on the path. How can I force it to use pig 0.11.0-SNAPSHOT while keeping pig 0.8.1 available to users?

@SarahMohamed

I have the same problem, @billonahill. Although I see "Starting ambrose web server on port 8080. Browse to http://localhost:8080/ to see job progress." in the terminal, when I click the link I get "cannot connect to localhost:8080". Same when I browse to "http://localhost:8080/web/dag".

Also I get this error, although execution completes normally:
Exception in thread "Thread-1" java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
at java.lang.ClassLoader.checkCerts(ClassLoader.java:787)
at java.lang.ClassLoader.preDefineClass(ClassLoader.java:502)
at java.lang.ClassLoader.defineClass(ClassLoader.java:628)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
at org.eclipse.jetty.servlet.ServletContextHandler.&lt;init&gt;(ServletContextHandler.java:129)
at org.eclipse.jetty.servlet.ServletContextHandler.&lt;init&gt;(ServletContextHandler.java:109)
at org.eclipse.jetty.webapp.WebAppContext.&lt;init&gt;(WebAppContext.java:183)
at com.twitter.ambrose.server.ScriptStatusServer.run(ScriptStatusServer.java:119)
at java.lang.Thread.run(Thread.java:679)

@billonahill
Collaborator

@evergreen3472 the start-up scripts that ship with Pig 0.8 don't allow overriding with a newer version of Pig. See #25. You'll need to upgrade to Pig 0.9 or 0.10.

@SarahMohamed your issue is different. Would you mind filing a separate ticket, please? It seems to have to do with your Java security settings and the classes being loaded in the servlet package. Try unsetting CLASSPATH or PIG_CLASSPATH if they're currently set.
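To try that suggestion before launching pig-ambrose, something like the following (a sketch, assuming the conflicting servlet classes are being pulled in via these variables):

```shell
# Sketch: clear ambient classpath variables that can drag in a second
# servlet-api jar and trigger the signer-mismatch SecurityException.
unset CLASSPATH PIG_CLASSPATH

# Confirm both are now unset before starting pig-ambrose.
echo "CLASSPATH=${CLASSPATH:-<unset>} PIG_CLASSPATH=${PIG_CLASSPATH:-<unset>}"
```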

@evergreen3472

We have tried some higher versions. There are some compatibility issues with CDH3u1 which keep us on 0.8.

@ami07

Hi
I have checked this issue and issue #25. Yet, I am still unable to get the UI to start. I tried using pig 0.10.0, pig 0.10.1, and the version in the trunk, and generated the jar files on my machine.

The first few lines look as follows:
PIG_CLASSPATH=/home/xxx/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/jetty-6.1.25.jar:/home/xxx/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/guava-12.0.1.jar:/home/xxx/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/jetty-util-6.1.25.jar:/home/xxx/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/ambrose-pig-0.1.0-SNAPSHOT.jar:/home/xxx/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/servlet-api-2.5-20081211.jar:/home/xxx/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/slf4j-simple-1.6.4.jar:/home/xxx/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/pig-0.11.0-20121012.180544-296.jar:/home/xxx/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/ambrose-common-0.1.0-SNAPSHOT.jar:/home/xxx/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/slf4j-api-1.6.4.jar:
PIG_OPTS=-Dpig.notification.listener=com.twitter.ambrose.pig.EmbeddedAmbrosePigProgressNotificationListener -Dpig.additional.jars=/home/xxx/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/guava-12.0.1.jar -Dambrose.port=8080 -Dambrose.post.script.sleep.seconds=600
2013-01-06 12:35:39,608 [main] INFO org.apache.pig.Main - Apache Pig version 0.11.0-SNAPSHOT (r1397657) compiled Oct 12 2012, 18:02:28

I tried to change the port number but it did not make any difference.

When I use , I get:

Find hadoop at /home/xxx/hadoop-1.0.1/bin/hadoop
dry run:
HADOOP_CLASSPATH: /home/xxx/workspace/trunk/bin/../conf:/usr/lib/jvm/java-6-sun/lib/tools.jar:/home/xxx/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/jetty-6.1.25.jar:/home/xxx/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/guava-12.0.1.jar:/home/xxx/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/jetty-util-6.1.25.jar:/home/xxx/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/ambrose-pig-0.1.0-SNAPSHOT.jar:/home/xxx/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/servlet-api-2.5-20081211.jar:/home/xxx/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/slf4j-simple-1.6.4.jar:/home/xxx/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/pig-0.11.0-20121012.180544-296.jar:/home/xxx/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/ambrose-common-0.1.0-SNAPSHOT.jar:/home/xxx/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/slf4j-api-1.6.4.jar::/home/xxx/workspace/trunk/bin/../build/ivy/lib/Pig/jython-standalone-2.5.2.jar:/home/xxx/workspace/trunk/bin/../build/ivy/lib/Pig/jruby-complete-1.6.7.jar:/home/xxx/workspace/trunk/bin/../pig-withouthadoop.jar:
HADOOP_OPTS: -Xmx1000m -Dpig.notification.listener=com.twitter.ambrose.pig.EmbeddedAmbrosePigProgressNotificationListener -Dpig.additional.jars=/home/xxx/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/guava-12.0.1.jar -Dambrose.port=8080 -Dambrose.post.script.sleep.seconds=600 -Dpig.log.dir=/home/xxx/workspace/trunk/bin/../logs -Dpig.log.file=pig.log -Dpig.home.dir=/home/xxx/workspace/trunk/bin/..
/home/xxx/hadoop-1.0.1/bin/hadoop jar /home/xxx/workspace/trunk/bin/../pig-withouthadoop.jar -f pigTest_L3.pig

After pig finishes running, these lines are printed, so I am sure pig 0.11.0-SNAPSHOT is the one being run!

HadoopVersion PigVersion UserId StartedAt FinishedAt Features
1.0.1 0.11.0-SNAPSHOT iman 2013-01-06 12:35:41 2013-01-06 12:36:08 HASH_JOIN,GROUP_BY

Additionally, when pig 0.11.0-SNAPSHOT is executed, the mapreduce job fails at the hadoop setup task with the error below (note that running pig 0.10.x with the same pig script does not return any errors; the jobs execute fine and produce results at the end):

Error: java.lang.ClassNotFoundException: org.joda.time.DateTime
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
at java.lang.Class.getDeclaredFields0(Native Method)
at java.lang.Class.privateGetDeclaredFields(Class.java:2291)
at java.lang.Class.getDeclaredField(Class.java:1880)
at java.io.ObjectStreamClass.getDeclaredSUID(ObjectStreamClass.java:1610)
at java.io.ObjectStreamClass.access$700(ObjectStreamClass.java:52)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:425)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.&lt;init&gt;(ObjectStreamClass.java:413)
at java.io.ObjectStreamClass.lookup(ObjectStreamClass.java:310)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:547)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1582)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1495)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1582)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1495)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1731)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1328)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:350)
at java.util.LinkedList.readObject(LinkedList.java:964)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:974)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1848)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1752)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1328)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:350)
at org.apache.pig.impl.util.ObjectSerializer.deserialize(ObjectSerializer.java:53)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.getStores(PigOutputFormat.java:218)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.setupUdfEnvAndStores(PigOutputFormat.java:245)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.getOutputCommitter(PigOutputFormat.java:275)
at org.apache.hadoop.mapred.Task.initialize(Task.java:515)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:353)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
at org.apache.hadoop.mapred.Child.main(Child.java:249)

I would really appreciate your help getting pig to run with ambrose.

@billonahill
Collaborator
@ami07

Thanks Bill for your reply.
The pig script I am using is from PigMix (L3).
Yes, it works fine with the pig version in the trunk (downloaded a couple of months ago).

When I replace the pig jar in the classpath generated by ambrose, the execution of the MapReduce jobs in the query fails with the same trace I listed in my previous comment.

I will try to use your patches with the pig version that I have from the trunk. I think I will also need to change the ambrose script to use the jar file I create instead of the one generated with ambrose, right?

@billonahill
Collaborator
@ami07

I pointed the pig-ambrose script to use the pig jar built from the trunk. The pig script executes fine with no errors. However, the UI would not start. I am also not seeing this line in the log (as you mentioned in the other issue):
12/06/28 16:37:25 INFO server.ScriptStatusServer: Starting ambrose web server on port 8080. Browse to http://localhost:8080/web to see job progress.

I am still not sure how to get ambrose to start.

@billonahill
Collaborator
@ami07

In all the cases where I was trying to get ambrose to run, I am using either the pig jar that comes with ambrose, which is 0.11-SNAPSHOT, or the pig.jar generated from the trunk, which is 0.12.0-SNAPSHOT. In all these cases, the UI did not start.

These are the lines in the log that show I am using the pig.jar that comes with ambrose:
2013-01-06 12:35:39,608 [main] INFO org.apache.pig.Main - Apache Pig version 0.11.0-SNAPSHOT (r1397657) compiled Oct 12 2012, 18:02:28

HadoopVersion PigVersion UserId StartedAt FinishedAt Features
1.0.1 0.11.0-SNAPSHOT iman 2013-01-06 12:35:41 2013-01-06 12:36:08 HASH_JOIN,GROUP_BY

In the second case, these are the lines in the log that show I am using the pig.jar generated from the trunk:
2013-01-08 13:02:03,712 [main] INFO org.apache.pig.Main - Apache Pig version 0.12.0-SNAPSHOT (r1430288) compiled Jan 08 2013, 09:12:57

HadoopVersion PigVersion UserId StartedAt FinishedAt Features
1.0.1 0.12.0-SNAPSHOT iman 2013-01-08 13:02:05 2013-01-08 13:04:49 HASH_JOIN,GROUP_BY

Is there any advice you can give me for debugging the code?

@sagemintblue
Collaborator
@ami07

This is the output when I use the -secretDebugCmd

PIG_CLASSPATH=/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/jetty-6.1.25.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/guava-12.0.1.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/jetty-util-6.1.25.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/ambrose-pig-0.1.0-SNAPSHOT.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/servlet-api-2.5-20081211.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/slf4j-simple-1.6.4.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/pig-0.11.0-20121012.180544-296.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/ambrose-common-0.1.0-SNAPSHOT.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/slf4j-api-1.6.4.jar:
PIG_OPTS=-Dpig.notification.listener=com.twitter.ambrose.pig.EmbeddedAmbrosePigProgressNotificationListener -Dpig.additional.jars=/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/guava-12.0.1.jar -Dambrose.port=8080 -Dambrose.post.script.sleep.seconds=600
Find hadoop at /home/iman/installs/hadoop-1.0.1/bin/hadoop
dry run:
HADOOP_CLASSPATH: /home/iman/workspace/trunk/bin/../conf:/usr/lib/jvm/java-6-sun/lib/tools.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/jetty-6.1.25.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/guava-12.0.1.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/jetty-util-6.1.25.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/ambrose-pig-0.1.0-SNAPSHOT.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/servlet-api-2.5-20081211.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/slf4j-simple-1.6.4.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/pig-0.11.0-20121012.180544-296.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/ambrose-common-0.1.0-SNAPSHOT.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/slf4j-api-1.6.4.jar::/home/iman/workspace/trunk/bin/../build/ivy/lib/Pig/jython-standalone-2.5.2.jar:/home/iman/workspace/trunk/bin/../build/ivy/lib/Pig/jruby-complete-1.6.7.jar:/home/iman/workspace/trunk/bin/../pig-withouthadoop.jar:
HADOOP_OPTS: -Xmx1000m -Dpig.notification.listener=com.twitter.ambrose.pig.EmbeddedAmbrosePigProgressNotificationListener -Dpig.additional.jars=/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/guava-12.0.1.jar -Dambrose.port=8080 -Dambrose.post.script.sleep.seconds=600 -Dpig.log.dir=/home/iman/workspace/trunk/bin/../logs -Dpig.log.file=pig.log -Dpig.home.dir=/home/iman/workspace/trunk/bin/..
/home/iman/installs/hadoop-1.0.1/bin/hadoop jar /home/iman/workspace/trunk/bin/../pig-withouthadoop.jar -f /home/iman/Documents/pigMix/pigTest_L3.pig

@sagemintblue
Collaborator
@ami07

I will try an earlier version of hadoop. However, Pig 0.10+ returns a storage error with earlier versions.

@ami07

I tried an earlier version of hadoop (0.20.x), but I am still unable to get it to work.

@ami07

Hello,
I thought I would send an update about how I fixed the problem. I got the UI to start by removing $HADOOP_HOME/bin from $PATH and adding it to PIG_CLASSPATH in the pig-ambrose script.

I still cannot figure out why this problem was happening, though.
Thanks
P.S. I am using the pig.jar in the trunk because my pigMix jobs fail when I use the pig.jar generated from ambrose.
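A shell sketch of that workaround, with HADOOP_HOME standing in for the actual install path (the value below is a placeholder, patterned after the paths in this thread):

```shell
# Sketch of the workaround above: drop $HADOOP_HOME/bin from PATH and put it
# on PIG_CLASSPATH instead. HADOOP_HOME here is a placeholder for your install.
HADOOP_HOME="${HADOOP_HOME:-/home/xxx/hadoop-1.0.1}"

# Rebuild PATH without the hadoop bin directory (fixed-string, whole-line match).
PATH=$(printf '%s\n' "$PATH" | tr ':' '\n' | grep -Fxv "$HADOOP_HOME/bin" | paste -sd: -)
export PATH

# Prepend the hadoop bin directory to PIG_CLASSPATH for the pig-ambrose script.
export PIG_CLASSPATH="$HADOOP_HOME/bin${PIG_CLASSPATH:+:$PIG_CLASSPATH}"
```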

@sagemintblue
Collaborator

@ami07 thanks for sharing your workaround with us! Could you also let us know what the final output of pig-ambrose -secretDebugCmd is so we know the position in the classpath sequence where $HADOOP_HOME/bin ended up?

Looking forward to hearing any more bugs / feature requests you might have for Ambrose.

@ami07

This is the output when using -secretDebugCmd

PIG_CLASSPATH=/home/iman/workspace/trunk/pig.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/jetty-6.1.25.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/guava-12.0.1.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/jetty-util-6.1.25.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/ambrose-pig-0.1.0-SNAPSHOT.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/servlet-api-2.5-20081211.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/slf4j-simple-1.6.4.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/pig-0.11.0-20121012.180544-296.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/ambrose-common-0.1.0-SNAPSHOT.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/slf4j-api-1.6.4.jar::/home/iman/installs/hadoop-1.0.1/conf
PIG_OPTS=-Dpig.notification.listener=com.twitter.ambrose.pig.EmbeddedAmbrosePigProgressNotificationListener -Dpig.additional.jars=/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/guava-12.0.1.jar -Dambrose.port=8080 -Dambrose.post.script.sleep.seconds=600
Cannot find local hadoop installation, using bundled hadoop 20.2
dry run:
/usr/lib/jvm/java-6-sun/bin/java -Xmx1000m -Dpig.notification.listener=com.twitter.ambrose.pig.EmbeddedAmbrosePigProgressNotificationListener -Dpig.additional.jars=/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/guava-12.0.1.jar -Dambrose.port=8080 -Dambrose.post.script.sleep.seconds=600 -Dpig.log.dir=/home/iman/workspace/trunk/bin/../logs -Dpig.log.file=pig.log -Dpig.home.dir=/home/iman/workspace/trunk/bin/.. -classpath /home/iman/workspace/trunk/bin/../conf:/usr/lib/jvm/java-6-sun/lib/tools.jar:/home/iman/workspace/trunk/pig.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/jetty-6.1.25.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/guava-12.0.1.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/jetty-util-6.1.25.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/ambrose-pig-0.1.0-SNAPSHOT.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/servlet-api-2.5-20081211.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/slf4j-simple-1.6.4.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/pig-0.11.0-20121012.180544-296.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/ambrose-common-0.1.0-SNAPSHOT.jar:/home/iman/ambrose/ambrose-pig-0.1.0-SNAPSHOT/lib/slf4j-api-1.6.4.jar::/home/iman/installs/hadoop-1.0.1/conf:/home/iman/workspace/trunk/bin/../build/ivy/lib/Pig/jython-standalone-2.5.2.jar:/home/iman/workspace/trunk/bin/../build/ivy/lib/Pig/jruby-complete-1.6.7.jar:/home/iman/workspace/trunk/bin/../pig.jar org.apache.pig.Main

@marutmisra08

Hi All,

I am facing the same issue. The demo is running fine.
Below is the log for the same.

mmisra@inpunpc310396:~/ambrose/pig/target/ambrose-pig-0.2.9-SNAPSHOT-bin/ambrose-pig-0.2.9-SNAPSHOT$ export AMBROSE_PORT=4567
mmisra@inpunpc310396:~/ambrose/pig/target/ambrose-pig-0.2.9-SNAPSHOT-bin/ambrose-pig-0.2.9-SNAPSHOT$ ./bin/pig-ambrose -f /home/mmisra/pig_demo/ambrose_demo.pig
PIG_CLASSPATH=/home/mmisra/ambrose/pig/target/ambrose-pig-0.2.9-SNAPSHOT-bin/ambrose-pig-0.2.9-SNAPSHOT/lib/slf4j-api-1.7.5.jar:/home/mmisra/ambrose/pig/target/ambrose-pig-0.2.9-SNAPSHOT-bin/ambrose-pig-0.2.9-SNAPSHOT/lib/jackson-core-2.1.1.jar:/home/mmisra/ambrose/pig/target/ambrose-pig-0.2.9-SNAPSHOT-bin/ambrose-pig-0.2.9-SNAPSHOT/lib/jackson-databind-2.1.1.jar:/home/mmisra/ambrose/pig/target/ambrose-pig-0.2.9-SNAPSHOT-bin/ambrose-pig-0.2.9-SNAPSHOT/lib/ambrose-common-0.2.9-SNAPSHOT.jar:/home/mmisra/ambrose/pig/target/ambrose-pig-0.2.9-SNAPSHOT-bin/ambrose-pig-0.2.9-SNAPSHOT/lib/pig-0.11.1.jar:/home/mmisra/ambrose/pig/target/ambrose-pig-0.2.9-SNAPSHOT-bin/ambrose-pig-0.2.9-SNAPSHOT/lib/guava-14.0.1.jar:/home/mmisra/ambrose/pig/target/ambrose-pig-0.2.9-SNAPSHOT-bin/ambrose-pig-0.2.9-SNAPSHOT/lib/jetty-util-6.1.25.jar:/home/mmisra/ambrose/pig/target/ambrose-pig-0.2.9-SNAPSHOT-bin/ambrose-pig-0.2.9-SNAPSHOT/lib/ambrose-pig-0.2.9-SNAPSHOT.jar:/home/mmisra/ambrose/pig/target/ambrose-pig-0.2.9-SNAPSHOT-bin/ambrose-pig-0.2.9-SNAPSHOT/lib/jetty-6.1.25.jar:/home/mmisra/ambrose/pig/target/ambrose-pig-0.2.9-SNAPSHOT-bin/ambrose-pig-0.2.9-SNAPSHOT/lib/jackson-annotations-2.1.1.jar:/home/mmisra/ambrose/pig/target/ambrose-pig-0.2.9-SNAPSHOT-bin/ambrose-pig-0.2.9-SNAPSHOT/lib/slf4j-simple-1.7.5.jar:/home/mmisra/ambrose/pig/target/ambrose-pig-0.2.9-SNAPSHOT-bin/ambrose-pig-0.2.9-SNAPSHOT/lib/servlet-api-2.5-20081211.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/conf/
PIG_OPTS=-Dpig.notification.listener=com.twitter.ambrose.pig.EmbeddedAmbrosePigProgressNotificationListener -Dpig.additional.jars=/home/mmisra/ambrose/pig/target/ambrose-pig-0.2.9-SNAPSHOT-bin/ambrose-pig-0.2.9-SNAPSHOT/lib/guava-14.0.1.jar -Dambrose.port=4567 -Dambrose.post.script.sleep.seconds=600
2013-11-18 19:54:41,713 [main] INFO org.apache.pig.Main - Apache Pig version 0.11.1 (r1462419) compiled Mar 29 2013, 02:50:12
2013-11-18 19:54:41,713 [main] INFO org.apache.pig.Main - Logging error messages to: /home/mmisra/ambrose/pig/target/ambrose-pig-0.2.9-SNAPSHOT-bin/ambrose-pig-0.2.9-SNAPSHOT/pig_1384784681711.log
2013-11-18 19:54:42,197 [main] INFO org.apache.pig.impl.util.Utils - Default bootup file /home/mmisra/.pigbootup not found
2013-11-18 19:54:42,306 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: hdfs://INPUNPC310407:9000
2013-11-18 19:54:42,422 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to map-reduce job tracker at: INPUNPC310407:9001
2013-11-18 19:54:43,181 [main] INFO org.apache.pig.tools.pigstats.ScriptState - Pig features used in the script: COGROUP,FILTER
2013-11-18 19:54:43,298 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MRCompiler - File concatenation threshold: 100 optimistic? false
2013-11-18 19:54:43,321 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size before optimization: 1
2013-11-18 19:54:43,321 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size after optimization: 1
2013-11-18 19:54:43,374 [main] INFO org.apache.pig.tools.pigstats.ScriptState - Pig script settings are added to the job
2013-11-18 19:54:43,386 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - mapred.job.reduce.markreset.buffer.percent is not set, set to default 0.3
2013-11-18 19:54:43,388 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Using reducer estimator: org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.InputSizeReducerEstimator
2013-11-18 19:54:43,392 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.InputSizeReducerEstimator - BytesPerReducer=1000000000 maxReducers=999 totalInputFileSize=128
2013-11-18 19:54:43,392 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Setting Parallelism to 1
2013-11-18 19:54:43,639 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - creating jar file Job1599178825210753644.jar
2013-11-18 19:54:45,860 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - jar file Job1599178825210753644.jar created
2013-11-18 19:54:45,873 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Setting up single store job
2013-11-18 19:54:45,879 [main] INFO org.apache.pig.data.SchemaTupleFrontend - Key [pig.schematuple] is false, will not generate code.
2013-11-18 19:54:45,879 [main] INFO org.apache.pig.data.SchemaTupleFrontend - Starting process to move generated code to distributed cacche
2013-11-18 19:54:45,879 [main] INFO org.apache.pig.data.SchemaTupleFrontend - Setting key [pig.schematuple.classes] with classes to deserialize []
2013-11-18 19:54:45,940 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 1 map-reduce job(s) waiting for submission.
2013-11-18 19:54:46,226 [JobControl] INFO org.apache.hadoop.mapreduce.lib.input.FileInputFormat - Total input paths to process : 1
2013-11-18 19:54:46,226 [JobControl] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths to process : 1
2013-11-18 19:54:46,236 [JobControl] WARN org.apache.hadoop.io.compress.snappy.LoadSnappy - Snappy native library is available
2013-11-18 19:54:46,237 [JobControl] INFO org.apache.hadoop.util.NativeCodeLoader - Loaded the native-hadoop library
2013-11-18 19:54:46,237 [JobControl] INFO org.apache.hadoop.io.compress.snappy.LoadSnappy - Snappy native library loaded
2013-11-18 19:54:46,240 [JobControl] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths (combined) to process : 1
2013-11-18 19:54:46,244 [JobControl] INFO org.apache.hadoop.mapreduce.lib.input.FileInputFormat - Total input paths to process : 1
2013-11-18 19:54:46,244 [JobControl] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths to process : 1
2013-11-18 19:54:46,245 [JobControl] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths (combined) to process : 1
2013-11-18 19:54:46,441 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 0% complete
2013-11-18 19:54:47,158 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - HadoopJobId: job_201311180253_0006
2013-11-18 19:54:47,158 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Processing aliases cogroup_join,cogroup_join_filter,first,flatten_co_group,second
2013-11-18 19:54:47,158 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - detailed locations: M: first[7,8],first[-1,-1],cogroup_join[9,15],second[8,9],second[-1,-1],cogroup_join[9,15] C: R: cogroup_join_filter[10,22],flatten_co_group[12,19]
2013-11-18 19:54:47,158 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - More information at: http://INPUNPC310407:50030/jobdetails.jsp?jobid=job_201311180253_0006
2013-11-18 19:54:50,681 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 50% complete
2013-11-18 19:54:57,714 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 66% complete
2013-11-18 19:55:01,762 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 100% complete
2013-11-18 19:55:01,764 [main] INFO org.apache.pig.tools.pigstats.SimplePigStats - Script Statistics:

HadoopVersion PigVersion UserId StartedAt FinishedAt Features
0.20.2-cdh3u6 0.11.1 mmisra 2013-11-18 19:54:43 2013-11-18 19:55:01 COGROUP,FILTER

Success!

Job Stats (time in seconds):
JobId Maps Reduces MaxMapTime MinMapTIme AvgMapTime MedianMapTime MaxReduceTime MinReduceTime AvgReduceTime MedianReducetime Alias Feature Outputs
job_201311180253_0006 2 1 1 1 1 1 8 8 8 8 cogroup_join,cogroup_join_filter,first,flatten_co_group,second COGROUP hdfs://INPUNPC310407:9000/tmp/temp68044712/tmp-240041742,

Input(s):
Successfully read 6 records from: "/mmisra/input/pig/input_1.txt"
Successfully read 6 records from: "/mmisra/input/pig/input_2.txt"

Output(s):
Successfully stored 5 records (121 bytes) in: "hdfs://INPUNPC310407:9000/tmp/temp68044712/tmp-240041742"

Counters:
Total records written : 5
Total bytes written : 121
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0

Job DAG:
job_201311180253_0006

2013-11-18 19:55:01,775 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Success!
2013-11-18 19:55:01,778 [main] INFO org.apache.pig.data.SchemaTupleBackend - Key [pig.schematuple] was not set... will not generate code.
2013-11-18 19:55:01,788 [main] INFO org.apache.hadoop.mapreduce.lib.input.FileInputFormat - Total input paths to process : 1
2013-11-18 19:55:01,788 [main] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths to process : 1
(1,2,3,4,5)
(1,2,5,67,8)
(1,2,4,6,8)
(1,3,5,7,8)
(1,3,5,6,8)

@sagemintblue
Collaborator
@marutmisra08

pig -version
Apache Pig version 0.12.0 (r1529718)
compiled Oct 07 2013, 12:20:14

But this is a higher version, and I think ambrose should work with it as well?
Env Variables:

export HADOOP_HOME=/usr/lib/yarn/hadoop-0.20.2-cdh3u6
export PATH=$PATH:$HADOOP_HOME/bin
export PIG_HOME=/usr/lib/yarn/pig-0.12.0
export PATH=$PATH:$PIG_HOME/bin
export PIG_CLASSPATH=/usr/lib/yarn/hadoop-0.20.2-cdh3u6/conf/
export PATH=$PATH:$PIG_HOME/bin
export PREFIX=/usr/lib/yarn/hue-1.2.0.0-cdh3u6

@sagemintblue
Collaborator
@marutmisra08

I made the changes; below is the error now.

mmisra@inpunpc310396:~/ambrose/pig/target/ambrose-pig-0.2.9-SNAPSHOT-bin/ambrose-pig-0.2.9-SNAPSHOT$ ./bin/pig-ambrose /home/mmisra/pig_demo/word_count.pig
PIG_CLASSPATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/lib/yarn/hive-0.7.1-cdh3u6/bin:/usr/lib/yarn/pig-0.12.0/bin:/usr/lib/yarn/pig-0.12.0/bin:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/bin
PIG_OPTS=-Dpig.notification.listener=com.twitter.ambrose.pig.EmbeddedAmbrosePigProgressNotificationListener -Dpig.additional.jars=/home/mmisra/ambrose/pig/target/ambrose-pig-0.2.9-SNAPSHOT-bin/ambrose-pig-0.2.9-SNAPSHOT/lib/guava-14.0.1.jar -Dambrose.port=8080 -Dambrose.post.script.sleep.seconds=600
Exception in thread "main" java.lang.IncompatibleClassChangeError: Implementing class
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:788)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:447)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.pig.tools.pigstats.PigStatsUtil.&lt;clinit&gt;(PigStatsUtil.java:54)
at org.apache.pig.Main.run(Main.java:636)
at org.apache.pig.Main.main(Main.java:157)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:197)

Output of secretDebugCmd

PIG_CLASSPATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/lib/yarn/hive-0.7.1-cdh3u6/bin:/usr/lib/yarn/pig-0.12.0/bin:/usr/lib/yarn/pig-0.12.0/bin:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/bin
PIG_OPTS=-Dpig.notification.listener=com.twitter.ambrose.pig.EmbeddedAmbrosePigProgressNotificationListener -Dpig.additional.jars=/home/mmisra/ambrose/pig/target/ambrose-pig-0.2.9-SNAPSHOT-bin/ambrose-pig-0.2.9-SNAPSHOT/lib/guava-14.0.1.jar -Dambrose.port=8080 -Dambrose.post.script.sleep.seconds=600
Find hadoop at /usr/bin/hadoop
dry run:
HADOOP_CLASSPATH: /etc/hbase:/opt/cloudera/parcels/CDH-4.4.0-1.cdh4.4.0.p0.39/lib/pig/bin/../conf:/usr/lib/jvm/java-7-openjdk-amd64//lib/tools.jar:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/lib/yarn/hive-0.7.1-cdh3u6/bin:/usr/lib/yarn/pig-0.12.0/bin:/usr/lib/yarn/pig-0.12.0/bin:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/bin:/usr/lib/yarn/hbase-0.90.6-cdh3u6/conf:/usr/lib/jvm/java-7-openjdk-amd64//lib/tools.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6:/usr/lib/yarn/hbase-0.90.6-cdh3u6/hbase-0.90.6-cdh3u6.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/hbase-0.90.6-cdh3u6-tests.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/activation-1.1.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/asm-3.2.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/avro-1.5.4.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/avro-ipc-1.5.4.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/commons-cli-1.2.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/commons-codec-1.4.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/commons-el-1.0.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/commons-httpclient-3.1.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/commons-io-2.1.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/commons-lang-2.5.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/commons-logging-1.1.1.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/commons-net-1.4.1.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/core-3.1.1.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/guava-r06.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/guava-r09-jarjar.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/hadoop-core-0.20.2-cdh3u6.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/jackson-core-asl-1.8.8.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/jackson-jaxrs-1.8.8.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/jackson-mapper-asl-1.8.8.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/jackson-xc-1.8.8.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/jamon-runtime-2.3.1.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/jasper-compiler-5.5.23.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/jasper-runtime-5.5.23.jar:/usr/lib/yarn/hbase-0.90.6-cdh3
u6/lib/jaxb-api-2.1.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/jersey-core-1.8.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/jersey-json-1.8.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/jersey-server-1.8.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/jettison-1.1.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/jetty-6.1.26.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/jetty-util-6.1.26.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/jruby-complete-1.6.0.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/jsp-2.1-6.1.14.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/jsp-api-2.1-6.1.14.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/jsp-api-2.1.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/jsr311-api-1.1.1.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/log4j-1.2.16.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/netty-3.2.4.Final.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/protobuf-java-2.3.0.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/servlet-api-2.5.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/slf4j-api-1.5.8.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/slf4j-log4j12-1.5.8.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/snappy-java-1.0.3.2.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/stax-api-1.0.1.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/thrift-0.2.0.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/velocity-1.5.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/xmlenc-0.52.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/zookeeper-3.3.5-cdh3u6.jar:/etc/hadoop/conf:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/:/opt/cloudera/parcels/CDH-4.4.0-1.cdh4.4.0.p0.39/bin/../lib/zookeeper/:/opt/cloudera/parcels/CDH-4.4.0-1.cdh4.4.0.p0.39/bin/../lib/zookeeper/lib/:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/conf/:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/conf:/usr/lib/jvm/java-7-openjdk-amd64//lib/tools.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/hadoop-core-0.20.2-cdh3u6.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/
ant-contrib-1.0b3.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/asm-3.2.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/aspectjrt-1.6.5.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/aspectjtools-1.6.5.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/commons-cli-1.2.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/commons-codec-1.4.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/commons-daemon-1.0.1.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/commons-el-1.0.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/commons-httpclient-3.1.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/commons-io-2.1.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/commons-lang-2.4.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/commons-logging-1.0.4.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/commons-logging-api-1.0.4.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/commons-net-3.1.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/core-3.1.1.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/guava-r09-jarjar.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/hadoop-fairscheduler-0.20.2-cdh3u6.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/hsqldb-1.8.0.10.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/jackson-core-asl-1.5.2.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/jasper-compiler-5.5.12.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/jasper-runtime-5.5.12.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/jersey-core-1.8.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/jersey-json-1.8.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/jersey-server-1.8.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/jets3t-0.6.1.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/jetty-6.1.26.cloudera.2.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/jetty-servlet-tester-6.1.26.cloudera.2.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/jetty-util-6.1.26.cloudera.2.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/jsch-0.1.42.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/junit-4.5.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/kfs-0.2.2.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/log4j-1.2.15.ja
r:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/mockito-all-1.8.2.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/oro-2.0.8.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/servlet-api-2.5-20081211.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/slf4j-api-1.4.3.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/slf4j-log4j12-1.4.3.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/xmlenc-0.52.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/jsp-2.1/jsp-2.1.jar:/usr/lib/yarn/hadoop-0.20.2-cdh3u6/lib/jsp-2.1/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-4.4.0-1.cdh4.4.0.p0.39/lib/pig/bin/../lib/avro-1.7.4.jar:/opt/cloudera/parcels/CDH-4.4.0-1.cdh4.4.0.p0.39/lib/pig/bin/../lib/jackson-core-asl-1.8.8.jar:/opt/cloudera/parcels/CDH-4.4.0-1.cdh4.4.0.p0.39/lib/pig/bin/../lib/jackson-mapper-asl-1.8.8.jar:/opt/cloudera/parcels/CDH-4.4.0-1.cdh4.4.0.p0.39/lib/pig/bin/../lib/json-simple-1.1.jar:/opt/cloudera/parcels/CDH-4.4.0-1.cdh4.4.0.p0.39/lib/pig/bin/../lib/jython-standalone-2.5.2.jar:/opt/cloudera/parcels/CDH-4.4.0-1.cdh4.4.0.p0.39/lib/pig/bin/../lib/snappy-java-1.0.4.1.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/hbase-0.90.6-cdh3u6.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/hbase-0.90.6-cdh3u6-tests.jar:/opt/cloudera/parcels/CDH-4.4.0-1.cdh4.4.0.p0.39/lib/pig/bin/../pig-0.11.0-cdh4.4.0-withouthadoop.jar:
HADOOP_OPTS: -Xmx1000m -Dpig.notification.listener=com.twitter.ambrose.pig.EmbeddedAmbrosePigProgressNotificationListener -Dpig.additional.jars=/home/mmisra/ambrose/pig/target/ambrose-pig-0.2.9-SNAPSHOT-bin/ambrose-pig-0.2.9-SNAPSHOT/lib/guava-14.0.1.jar -Dambrose.port=8080 -Dambrose.post.script.sleep.seconds=600 -Dpig.log.dir=/opt/cloudera/parcels/CDH-4.4.0-1.cdh4.4.0.p0.39/lib/pig/bin/../logs -Dpig.log.file=pig.log -Dpig.home.dir=/opt/cloudera/parcels/CDH-4.4.0-1.cdh4.4.0.p0.39/lib/pig/bin/..
/usr/bin/hadoop jar /opt/cloudera/parcels/CDH-4.4.0-1.cdh4.4.0.p0.39/lib/pig/bin/../pig-0.11.0-cdh4.4.0-withouthadoop.jar
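An `IncompatibleClassChangeError` like the one above usually means jars from two different Hadoop/Pig releases ended up on the same classpath (the HADOOP_CLASSPATH dump shows both CDH3u6 and CDH4 artifacts). One quick way to spot such conflicts is to look for jar basenames that appear under more than one path. A minimal sketch; the classpath value here is an illustrative excerpt, not the full output:

```shell
#!/bin/sh
# List jar basenames that occur more than once on a CLASSPATH-style string.
# Duplicate basenames are a common symptom of mixed Hadoop/Pig installs.
cp="/usr/lib/yarn/hadoop-0.20.2-cdh3u6/hadoop-core-0.20.2-cdh3u6.jar:/opt/cloudera/parcels/CDH-4.4.0-1.cdh4.4.0.p0.39/lib/pig/pig-0.11.0-cdh4.4.0-withouthadoop.jar:/usr/lib/yarn/hbase-0.90.6-cdh3u6/lib/hadoop-core-0.20.2-cdh3u6.jar"

# Split on ':', reduce each entry to its basename, and print duplicates.
echo "$cp" | tr ':' '\n' | xargs -n1 basename | sort | uniq -d
```

Running this against the real HADOOP_CLASSPATH from secretDebugCmd should surface any jar that is pulled in from both the CDH3 and CDH4 trees; those are the first candidates to remove.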

@marutmisra08

@sagemintblue The solution provided by ami07 works fine for me.
Earlier I was doing something wrong.

Thanks a lot for your help.
