
WARN jdbc.HiveConnection: Failed to connect to localhost:10000 #24

Open
mat-ale opened this issue Apr 30, 2019 · 11 comments

Comments

@mat-ale

mat-ale commented Apr 30, 2019

Hi,

this is my configuration:

hive-server:
    container_name:           hive-server
    image:                    bde2020/hive:2.3.2-postgresql-metastore
    env_file:
          - ./hive_build/hadoop-hive.env
    environment:
        HIVE_CORE_CONF_javax_jdo_option_ConnectionURL: "jdbc:postgresql://hive-metastore/metastore"
        SERVICE_PRECONDITION: "hive-metastore:9083"
    ports:
        - "10000:10000"
    
hive-metastore:
    container_name:           hive-metastore
    image:                    bde2020/hive:2.3.2-postgresql-metastore
    env_file:
        - ./hive_build/hadoop-hive.env
    command:                  /opt/hive/bin/hive --service metastore
    environment:
        SERVICE_PRECONDITION: "hadoop-namenode:50070 hadoop-datanode1:50075 hive-metastore-postgresql:5432"
    ports:
        - "9083:9083"

hive-metastore-postgresql:
    container_name:           hive-metastore-postgresql
    image:                    bde2020/hive-metastore-postgresql:2.3.0
    ports:
        - "5433:5432"

Hive should also connect to two HDFS containers (hadoop-namenode, hadoop-datanode1) that I have built and that are working fine on the expected ports.

When I run:

 docker-compose exec hive-server bash
 /opt/hive/bin/beeline -u jdbc:hive2://localhost:10000

I get:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hive/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop-2.7.4/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to jdbc:hive2://localhost:10000
19/04/30 12:21:53 [main]: WARN jdbc.HiveConnection: Failed to connect to localhost:10000
Could not open connection to the HS2 server. Please check the server URI and if the URI is correct, then ask the administrator to check the server status.
Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
Beeline version 2.3.2 by Apache Hive
beeline>

From the docker-compose logs I don't see any specific errors, so the containers seem to be running fine.

Any help on this please?
Thanks
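A generic first check (not something from the logs above, just a diagnostic sketch): verify whether anything is listening on port 10000 at all before blaming the JDBC URI. This uses bash's /dev/tcp pseudo-device, so it needs no extra tools inside the container; run it after `docker-compose exec hive-server bash`.

```shell
#!/bin/bash
# Sketch: TCP reachability check via bash's /dev/tcp redirection.
# "localhost" and 10000 match the JDBC URI from the question.
check_port() {
  local host=$1 port=$2
  # timeout guards against hangs on unroutable hosts; closed local
  # ports fail immediately with "Connection refused".
  if timeout 2 bash -c "echo > /dev/tcp/${host}/${port}" 2>/dev/null; then
    echo "open"
  else
    echo "closed"
  fi
}

check_port localhost 10000
```

If this prints `closed` inside the hive-server container, HiveServer2 itself never came up, and the problem is the server process rather than the connection string.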

@marcuslind90

I'm experiencing similar issues; the README file is not explicit enough.

@ntallapa12

I am trying to connect from my local machine:

beeline> !connect jdbc:hive2://hive-server:10000

I get an UnknownHostException. Please let us know how to make a Beeline or JDBC connection work.

@ntallapa12

networks:
    common-network:
        driver: overlay

Adding this to docker-compose resolved the issue for me.
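For context, a sketch of how that fix fits into the full compose file (the service entries below are my assumption about how it was wired up, reusing the service names from this thread): the network must be declared at the top level and then attached to each service, so the containers can resolve each other by service name. Note that the `overlay` driver requires Docker swarm mode; with plain docker-compose, the default `bridge` driver behaves similarly.

```yaml
# Sketch only: attaching services to a shared network (assumed layout).
networks:
    common-network:
        driver: overlay   # needs swarm mode; "bridge" for plain compose

services:
    hive-server:
        networks:
            - common-network
    hive-metastore:
        networks:
            - common-network
```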

@muzammil-irshad

@mat-ale did you resolve your issue?

@purbanow

I have the same problem.

@dhirendra31pandit

I have the same issue too and have not been able to fix it.

@jessequinn

any resolution?

@dhirendra31pandit

I solved it by using the IP address instead of localhost. I was running in a VM, and localhost on the desktop and localhost in the VM refer to two different machines. I hope this solves your issue.

@jessequinn

jessequinn commented Nov 11, 2020

The following should resolve the issue: beeline -u jdbc:hive2:// literally, with no host or port at all; let Beeline figure it out. This works well (embedded mode); however, remote mode does not appear to work, even when configured with NOSASL.

@jessequinn

jessequinn commented Nov 12, 2020

OK, it is now resolved. The following needs to be done:

[startup.sh]

#!/bin/bash

hadoop fs -mkdir       /tmp
hadoop fs -mkdir -p    /user/hive/warehouse
hadoop fs -chmod g+w   /tmp
hadoop fs -chmod g+w   /user/hive/warehouse

cd $HIVE_HOME/bin
./hive --service hiveserver2 --hiveconf hive.server2.thrift.port=10000 --hiveconf hive.root.logger=INFO,console --hiveconf hive.server2.enable.doAs=false

You probably do not need --hiveconf hive.server2.thrift.port=10000, since I also added it to hadoop-hive.env, but it doesn't hurt.

--hiveconf hive.root.logger=INFO,console gives me more details about problems; this is how I resolved this specific issue.

Add the following to both hadoop-hive.env and hadoop.env:

CORE_CONF_hadoop_proxyuser_hive_hosts=*

and add this to hadoop-hive.env:

HIVE_SITE_CONF_hive_server2_thrift_bind_host=0.0.0.0
HIVE_SITE_CONF_hive_server2_thrift_port=10000
HIVE_SITE_CONF_hive_metastore_event_db_notification_api_auth=false
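If I read the bde2020 images' configuration convention correctly (an assumption on my part, not stated in this thread), each HIVE_SITE_CONF_a_b_c variable is rewritten into a property in hive-site.xml, with underscores becoming dots. The three lines above would then correspond roughly to:

```xml
<!-- Sketch of the hive-site.xml properties the env vars above
     would generate (assumed mapping, underscores -> dots) -->
<property>
    <name>hive.server2.thrift.bind.host</name>
    <value>0.0.0.0</value>
</property>
<property>
    <name>hive.server2.thrift.port</name>
    <value>10000</value>
</property>
<property>
    <name>hive.metastore.event.db.notification.api.auth</name>
    <value>false</value>
</property>
```

Binding the Thrift server to 0.0.0.0 is what lets connections arrive from outside the container rather than only from its loopback interface.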

@rusonding

Logging initialized using configuration in file:/opt/hive/conf/hive-log4j2.properties Async: true
Exception in thread "main" java.lang.IllegalArgumentException: java.net.UnknownHostException: namenode
    at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:378)
    at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:320)
    at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:176)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:678)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:619)
    at org.apache
