Query a Secure HBase cluster through Phoenix #382

Closed

anilgupta84 opened this issue Aug 20, 2013 · 33 comments
@anilgupta84

Hi James and Team,

I would like to use Phoenix for SQL-like querying. However, our Hadoop/HBase cluster is secured, so it seems we need to enable users to log in to a secure HBase cluster and query tables through Phoenix. I am willing to work with the community and provide a patch that adds support in Phoenix for connecting to a secure cluster.

Thanks,
Anil Gupta
Software Engineer, Intuit, Inc.

@jtaylor-sfdc
Contributor

On Thu, Aug 22, 2013 at 12:03 PM, anil gupta anilgupta84@gmail.com wrote:

Inline

On Thu, Aug 22, 2013 at 11:14 AM, Gary Helmling ghelmling@gmail.com wrote:

Hi Anil,

More info on k5start here: http://linux.die.net/man/1/k5start

What I already tried: running the kinit command before invoking sqlline.sh. It didn't work. As per
my analysis, it didn't work because when Phoenix creates the HBase configuration it does not use the
UserGroupInformation.setConfiguration() and User.login() methods.

User.login() should not be necessary from the client side. This is called
in the daemon process startup (HMaster, HRegionServer, etc) to login those
processes from keytab files. I suppose you could call it from the client
side (though you would have to have matching configuration keys), but it
shouldn't be necessary if you've externally obtained credentials via kinit
or k5start.
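
For illustration, a minimal sketch of obtaining credentials externally before launching the Phoenix client (the keytab path and principal below are placeholders, not taken from this thread):

  # one-off ticket from a keytab
  kinit -kt /path/to/user.keytab user@EXAMPLE.COM

  # or keep a ticket refreshed for a long-running client with k5start
  k5start -f /path/to/user.keytab -U -K 60 -b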

I am curious to know whether you have been able to connect to a secure
HBase cluster with Phoenix. If so, it would be great if you could share how.

Unfortunately I haven't tested phoenix with a secure cluster, so I can't
verify that this works transparently. Maybe there is something else going
on that I'm missing. But these additional steps should not be necessary
strictly from the HBase client perspective.

I don't know much about k5start. Can you elaborate on it? How would you use it with Phoenix?

Yes, in Java code we have to use the UserGroupInformation.setConfiguration()
and User.login() methods to connect to a secure cluster.

This should not really be necessary. Does "hbase shell" work correctly
for you after doing a kinit? What does the hbase-site.xml present on your
client classpath look like?
Anil: Yes, it works fine after running the kinit command. I was actually
wondering about this after sending my last email; I need to see why the HBase
shell works with kinit. :/

I think I should ask this on the HBase mailing list.

Thanks & Regards,
Anil Gupta

@saket-srivastava

Hello Anil,

Please define the following:

  -Djava.security.auth.login.config
  -Djava.security.krb5.realm
  -Djava.security.krb5.kdc
  -Djava.security.krb5.conf

and update the classpath to contain the HBase conf, the Hadoop conf, the Hadoop Common jar, and the HBase jar.

Have phoenix-2.0-SNAPSHOT.20130805.jar in your classpath before any other Phoenix jars you may have.

Also ensure you run: kinit -k -t {your keytab} {your principal}
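
For illustration, the file referenced by -Djava.security.auth.login.config is a JAAS configuration file. A minimal sketch of one, assuming a typical setup that reuses the ticket cache obtained via kinit (the section name and options are assumptions, not taken from this thread):

  Client {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=false
    useTicketCache=true;
  };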

Cheers,
Saket.

@ryang-sfdc
Contributor

What about the hadoop auth jar too?

@saket-srivastava

Hello Ron,

Yes - thanks for pointing this.

Cheers,
Saket.

@anilgupta84
Author

Hi Saket and Ron,

Thanks for the details. I could not understand which auth jar you guys were talking about.

@saket-srivastava

Hello Anil,

hadoop-auth-2.0.0-cdh4.3.0.jar (we use the CDH 4.3.0 Hadoop release).

Cheers,
Saket.

@anilgupta84
Author

Thanks, Saket. I will try out your recommendation this week. Hopefully I won't hit too many hiccups.

@anilgupta84
Author

Hi All,

As per your suggestion, I modified sqlline.sh and added the additional parameters to the Phoenix invocation. The following is my final command in sqlline.sh:
java -cp "/tmp/phoenix/phoenix-2.0.0.jar:.:$phoenix_client_jar:/etc/hbase/conf/:/etc/hadoop/conf/:/opt/cloudera/parcels/CDH/lib/hadoop/hadoop-common-2.0.0-cdh4.3.0.jar:/opt/cloudera/parcels/CDH/lib/hbase/hbase-0.94.6-cdh4.3.0-security.jar:/opt/cloudera/parcels/CDH/lib/hadoop/hadoop-auth-2.0.0-cdh4.3.0.jar" -Djava.security.auth.login.config=/etc/hbase/conf/jaas.conf -Djava.security.krb5.realm=ABC.INTUIT.NET -Djava.security.krb5.kdc=abc01.intuit.net -Djava.security.krb5.conf=/etc/krb5.conf -Dlog4j.configuration=file:log4j.properties sqlline.SqlLine -d com.salesforce.phoenix.jdbc.PhoenixDriver -u jdbc:phoenix:$1 -n none -p none --color=true --fastConnect=false --silent=true
--isolation=TRANSACTION_READ_COMMITTED $sqlfile
But after doing the above, when I run sqlline.sh I get the following error on the command line:
org.apache.hadoop.net.NetUtils.getInputStream(Ljava/net/Socket;)Lorg/apache/hadoop/net/SocketInputWrapper;

The above error message is different from the one I used to get before (so the additional parameters made some impact). We are also using CDH 4.3. It seems like we are hitting #286. Is that correct?
Also, is there a way to get detailed failure logs when we run the sqlline.sh script?

Thanks,
Anil Gupta

@jtaylor-sfdc
Contributor

Hi Anil,
What are you using for $phoenix_client_jar? The only jar you need for
Phoenix is /tmp/phoenix/phoenix-2.0.0.jar. The other jars you need are
whichever jars are required for the HBase client (i.e., the secure ones in
your case).

Thanks,
James

@anilgupta84
Author

Hi James,

I removed $phoenix_client_jar from the classpath. But now I get the following error when running sqlline.sh:
Exception in thread "main" java.lang.NoClassDefFoundError: sqlline/SqlLine
Caused by: java.lang.ClassNotFoundException: sqlline.SqlLine

Are you suggesting that I should not use sqlline.sh to invoke Phoenix? If so, how would I invoke Phoenix without sqlline.sh?

Thanks for your guidance,
Anil

@jtaylor-sfdc
Contributor

If you want to use SqlLine, you'll need to add the SqlLine jar and its
dependent jars to the class path as well: sqlline-1.1.2.jar and
jline-2.11.jar

Thanks,
James

@anilgupta84
Author

Hi James,
I added the above-mentioned jars and now I don't get any error while invoking the sqlline.sh script, so I have moved one more step forward. :)
But when I try to run the "!tables" command, I get "java.lang.IllegalArgumentException: No current connection".
I also tried the "!list" command and got "#0 closed".

I invoked sqlline.sh with the following command: sqlline.sh
I tried to search the documentation for help on this issue but could not locate anything.

Thanks,
Anil

@jtaylor-sfdc
Contributor

Hi Anil,
Which documentation did you check?
You need to tell sqlline what you're connecting to, like this: sqlline localhost
This is documented on our home page in our Getting Started section here. It's the first thing you find if you search for "sqlline". If you invoke our sqlline.sh script and don't specify this, you'll get this message:

  Zookeeper not specified. 
  Usage: sqlline.sh <zookeeper> <optional_sql_file> 
  Example: 
   1. sqlline.sh localhost 
   2. sqlline.sh localhost ../examples/stock_symbol.sql

If you type "help" at the sqlline command prompt, you get a list of possible commands, one of which is "!connect", so that's another way.

Another option is using Google. I searched for "Phoenix sqlline" and the first link was this one:

  phoenix/bin/sqlline.sh at master · forcedotcom/phoenix · GitHub
  https://github.com/forcedotcom/phoenix/blob/master/bin/sqlline.sh‎
  Jun 17, 2013 - then echo -e "Zookeeper not specified. \nUsage: sqlline.sh <zookeeper> <optional_sql_file> \nExample: \n 1. sqlline.sh localhost \n 2.

Thanks,
James

@anilgupta84
Author

Hi James,

I did use the sqlline.sh command. Sorry, in my last email I gave you the wrong information due to a typo; I meant to say "I invoked sqlline.sh with the following command: sqlline.sh ". I went through the documentation link that you suggested.

Thanks,
Anil

@anilgupta84
Author

I just realized one thing: GitHub suppressed the words I put inside angle brackets! This is bad and confusing. Because of this, the sqlline invocation command in my message above is also wrong.
So that was not a typo on my part. :)
To be clear, I invoked sqlline.sh as: sqlline.sh zookeeper_quorum

@jtaylor-sfdc
Contributor

Please confirm that you can use the HBase shell with the identical class path. Thanks,

James

@anilgupta84
Author

Hi James,

I'll get back to you soon with the HBase shell information.
On a side note, I am wondering how Saket connects. Does he connect using the
SQuirreL client?
Where can I find the Phoenix logs when I invoke sqlline.sh, or how can I get
debug logs from sqlline.sh? It seems like the log4j.properties in the bin
directory only configures logging for psql.sh.

Thanks,
Anil

@mujtabachohan
Contributor

sqlline.sh also uses log4j.properties. Just remove this properties file from the sqlline path and the HBase connection log will be written to the console.
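
Alternatively, rather than deleting the file, a minimal log4j.properties that sends client logs to the console could be used. This is only an illustrative sketch using standard log4j 1.x settings, not a configuration taken from this thread:

  log4j.rootLogger=INFO,console
  log4j.appender.console=org.apache.log4j.ConsoleAppender
  log4j.appender.console.layout=org.apache.log4j.PatternLayout
  log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{2}: %m%n
  # raise specific packages to DEBUG for more detail, e.g.:
  # log4j.logger.org.apache.hadoop.hbase=DEBUG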

@anilgupta84
Author

Hi Mujtaba,

I deleted the log4j.properties file from the bin folder. Still no logs on
stdout. When I invoke sqlline, I only see one line on stdout:
org/apache/commons/logging/LogFactory

On a side note, I connected Phoenix to a non-secure cluster and it connected
successfully. There, when I invoke sqlline, I only see one line on stdout:
36/36 (100%) Done

What does the above message mean? With the secure cluster, it seems like
Phoenix is failing to connect.

Thanks,
Anil Gupta

@jtaylor-sfdc
Contributor

Hi Anil,
Using the exact same classpath you are using when trying to use sqlline,
are you able to connect to and use the HBase shell?
Thanks,
James

@anilgupta84
Author

@saket-srivastava: I am curious to know how you use Phoenix with your secure cluster. Do you use sqlline or SQuirreL?
My aim is to use either sqlline or SQuirreL to connect to secure HBase. We want non-developers to be able to use Phoenix to query HBase data.

Thanks,
Anil Gupta

@mujtabachohan
Contributor

Hi Anil,

The following is the output without log4j.properties on my machine. "36/36 (100%) Done" indicates the number of tables and columns loaded by sqlline. Also set the --silent=false command-line argument for more info in SQuirreL.

13/08/29 15:56:23 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=myzookeeper:2181 sessionTimeout=180000 watcher=hconnection
13/08/29 15:56:23 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 31897@myuser
13/08/29 15:56:23 INFO zookeeper.ClientCnxn: Opening socket connection to server mymachine/10.0.53.85:2181. Will not attempt to authenticate using SASL (unknown error)
13/08/29 15:56:23 INFO zookeeper.ClientCnxn: Socket connection established to mymachine/10.0.53.85:2181, initiating session
13/08/29 15:56:24 INFO zookeeper.ClientCnxn: Session establishment complete on server mymachine/10.0.53.85:2181, sessionid = 0x140cafc34a80037, negotiated timeout = 180000
13/08/29 15:56:24 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=myzookeeper:2181 sessionTimeout=180000 watcher=hconnection
13/08/29 15:56:24 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 31897@myuser
13/08/29 15:56:24 INFO zookeeper.ClientCnxn: Opening socket connection to server mymachine/10.0.53.85:2181. Will not attempt to authenticate using SASL (unknown error)
13/08/29 15:56:24 INFO zookeeper.ClientCnxn: Socket connection established to mymachine/10.0.53.85:2181, initiating session
13/08/29 15:56:24 INFO zookeeper.ClientCnxn: Session establishment complete on server mymachine/10.0.53.85:2181, sessionid = 0x140cafc34a80038, negotiated timeout = 180000
13/08/29 15:56:24 INFO client.HConnectionManager$HConnectionImplementation: Closed zookeeper sessionid=0x140cafc34a80037
13/08/29 15:56:24 INFO zookeeper.ZooKeeper: Session: 0x140cafc34a80037 closed
13/08/29 15:56:24 INFO zookeeper.ClientCnxn: EventThread shut down
13/08/29 15:56:24 INFO query.ConnectionQueryServicesImpl: LOAD: com.salesforce.phoenix.schema.TableRef@c7fba720
13/08/29 15:56:24 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=myzookeeper:2181 sessionTimeout=180000 watcher=hconnection
13/08/29 15:56:24 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 31897@myuser
13/08/29 15:56:24 INFO zookeeper.ClientCnxn: Opening socket connection to server mymachine/10.0.53.85:2181. Will not attempt to authenticate using SASL (unknown error)
13/08/29 15:56:24 INFO zookeeper.ClientCnxn: Socket connection established to mymachine/10.0.53.85:2181, initiating session
13/08/29 15:56:24 INFO zookeeper.ClientCnxn: Session establishment complete on server mymachine/10.0.53.85:2181, sessionid = 0x140cafc34a80039, negotiated timeout = 180000
13/08/29 15:56:24 INFO client.HConnectionManager$HConnectionImplementation: Closed zookeeper sessionid=0x140cafc34a80039
13/08/29 15:56:24 INFO zookeeper.ZooKeeper: Session: 0x140cafc34a80039 closed
13/08/29 15:56:24 INFO zookeeper.ClientCnxn: EventThread shut down
Building list of tables and columns for tab-completion (set fastconnect to true to skip)...
38/38 (100%) Done
Done
sqlline version 1.1.2
0: jdbc:phoenix:myzookeeper>

@anilgupta84
Author

Hi James,

I tried invoking "hbase shell" after modifying the classpath to following
value:
*
/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hbase/lib/jruby-complete-1.6.5.jar
*
:/tmp/phoenix/phoenix-2.0.0.jar:.:/etc/hbase/conf/:/etc/hadoop/conf/:/opt/cloudera/parcels/CDH/lib/hadoop/hadoop-common-2.0.0-cdh4.3.0.jar:/opt/cloudera/parcels/CDH/lib/hbase/hbase-0.94.6-cdh4.3.0-security.jar:/opt/cloudera/parcels/CDH/lib/hadoop/hadoop-auth-2.0.0-cdh4.3.0.jar:/tmp/phoenix/jline-2.11.jar:/tmp/phoenix/sqlline-1.1.2.jar:
*
/opt/cloudera/parcels/CDH/lib/hbase/lib/:/opt/cloudera/parcels/CDH-4.3.0-1.cdh4.3.0.p0.22/lib/hadoop/lib/
*

Parts highlighted in Bold Italics are extra from sqlline cp. These were
required to run HBase shell. So, i added them.

The strange thing is that the HBase shell didn't even read the cluster
configuration files. The shell came up, but when I ran the "list" command it
failed with "ERROR: org.apache.hadoop.hbase.MasterNotRunningException: Retried
7 times". In essence, the HBase conf files are not read when the shell is
invoked this way.

However, if I invoke "hbase shell" without modifying the classpath, it is able
to connect to the cluster. I am not sure why the HBase shell is not picking up
the conf files from the modified path.

Also, I found that "hbase shell" is invoked using only
-Djava.security.auth.login.config=jaas.conf; it does not use the other
security-related parameters for the KDC.

I hope I have provided the information you asked for.

Thanks,
Anil

@jtaylor-sfdc
Contributor

Hey Anil,
I'd start with getting the HBase shell to work so you're sure you have the config, classpath, etc. set up correctly. Then move on to trying with SQLLine. As I mentioned before, our guys here at Salesforce were able to get Phoenix working with a secure cluster without making any changes.
Thanks,
James

@anilgupta84
Author

Hi James,

I agree with you. I need to see why the Hadoop/HBase config is not getting picked up
even though it is present in the classpath. Things look bad right now.

Thanks,
Anil

@jtaylor-sfdc
Contributor

Ask for help on the HBase mailing list for that - they'll be able to
diagnose the issue quickly.

James

@anilgupta84
Author

@saket-srivastava and @jtaylor-sfdc: Since you are able to use sqlline.sh with CDH 4.3 to connect to a secure cluster, would you mind sharing the script that invokes sqlline? It would help me tremendously in figuring out the difference between your setup and mine.
Thanks,
Anil

@saket-srivastava

Script inline:

  current_dir=$(cd $(dirname $0);pwd)
  phoenix_jar_path="$current_dir/../target"
  phoenix_client_jar=$(find $phoenix_jar_path/phoenix-*-client.jar)
  #phoenix_client_jar=../target/phoenix-2.0-SNAPSHOT.jar
  if [ -z "$1" ]
  then echo -e "Zookeeper not specified. \nUsage: sqlline.sh \nExample: \n 1. sqlline.sh localhost \n 2. sqlline.sh localhost ../examples/stock_symbol.sql"; exit;
  fi
  if [ "$2" ]
  then sqlfile="--run=$2";
  fi
  /root/dev/current//bigdata-util/tools//Linux/jdk/jdk1.7.0_21_x64/bin/java \
    -Djava.security.auth.login.config=/root/dev/current//bigdata-hbase/hbase/hbase//conf/zk-jaas.conf \
    -Djava.security.krb5.realm=RHELMSGSERVICE.NET \
    -Djava.security.krb5.kdc=rhelmsgservice.internal.salesforce.com:rhelmsgservice.internal.salesforce.com \
    -Djava.security.krb5.conf=/home/sfdc/.keytab//krb5.conf \
    -Dsun.security.krb5.debug=true \
    -cp ".:/root/dev/current/bigdata-hadoop/hadoop/hadoop/etc/hadoop:/root/dev/current/bigdata-hbase/hbase/hbase/conf:/root/dev/current/bigdata-hbase/hbase/hbase/hbase-0.94.10-security.jar:/root/dev/current/bigdata-hadoop/hadoop/hadoop/share/hadoop/common/hadoop-common-2.0.0-cdh4.3.0.jar:/root/dev/current/bigdata-zookeeper/zookeeper/zookeeper/zookeeper-3.4.5.jar:/root/dev/current/bigdata-hbase/hbase/hbase/lib/hadoop-auth-2.0.0-cdh4.3.0.jar:../target/phoenix-2.0-SNAPSHOT.20130805.jar:$phoenix_client_jar" \
    -Dlog4j.configuration=file:$current_dir/log4j.properties \
    sqlline.SqlLine -d com.salesforce.phoenix.jdbc.PhoenixDriver -u jdbc:phoenix:$1 -n none -p none --color=true --fastConnect=false --silent=true --isolation=TRANSACTION_READ_COMMITTED $sqlfile

Cheers,
Saket.

@anilgupta84
Author

@saket-srivastava: Thanks a lot. Really appreciate your help.

@anilgupta84
Author

Hi Saket,

We are using parcels in CDH, so our directory structure for Hadoop is different from yours. I was able to map all of the jars in the above script to my Hadoop installation. But I am unable to determine the purpose of adding "/root/dev/current/bigdata-hadoop/hadoop/hadoop/etc/hadoop" to the classpath. Can you tell me which jars are used from that folder?

Also, these are some other differences between our setups:

  1. We are using JDK 1.6 while you are using JDK 1.7.
  2. Previously we were not adding zookeeper-3.4.5-cdh4.3.0.jar to the classpath; it seems like that is being used here.
  3. Even though we are both using CDH 4.3, it seems like there is a lot of difference in our Hadoop installations. Are you using Cloudera Manager for the Hadoop installation? If yes, which version?
  4. We are using CDH 4.3 HBase, i.e. HBase 0.94.6, while you are using HBase 0.94.10.

Thanks,
Anil Gupta

@saket-srivastava

< "/root/dev/current/bigdata-hadoop/hadoop/hadoop/etc/hadoop" >
[Saket] This folder contains the Hadoop configurations (xml)

< We are using jdk1.6 while you are using jdk1.7 >
[Saket] We migrated to JDK 1.7

< Before, we were not adding zookeeper-3.4.5-cdh4.3.0.jar in Classpath. It
seems like that is being used here. >
[Saket] Try without it; it seemed to be required in our setup.

< Even though, we are using cdh4.3. It seems like there is a lot of
difference in our Hadoop Installation. Are you using cloudera manager for
Hadoop installation? If yes, then which version? >
[Saket] We leverage CDH 4.3.1 Hadoop + HBase 0.94.10 from open source.
We do our own custom installations sans CDH Manager

< We are using cdh4.3 HBase i.e. HBase0.94.6 while you are using
HBase-0.94.10. >
[Saket] I've not tested against 0.94.6

Cheers,
Saket.

@anilgupta84
Author

There we GOOOO!! After modifying your script and putting in our values, Phoenix has connected to the cluster. I also ran a CREATE TABLE statement and it created the table. :)
Now we will move on to the next steps with Phoenix.
Thanks a lot to the Phoenix team for the support.

@jtaylor-sfdc
Contributor

Closing, as @anilgupta84 was able to get this working.
