TimeoutTransportException #676

Open
mufukuri opened this issue Oct 25, 2015 · 4 comments

@mufukuri

Hello,

I am trying to use the elasticsearch-jdbc importer to move data from MySQL to my local Elasticsearch server, but I am getting a TimeoutTransportException every time. This is my Elasticsearch configuration file:

Additionally, I have also added the error from jdbc.log and the script file I am running. Can you please assist?

@jprante
Owner

jprante commented Oct 25, 2015

How can I assist if you do not show the error message and your JDBC importer command?

There is no known exception called TimeoutTransportException; can you please clarify?

@mufukuri
Author

My apologies, I failed to upload the log file. This is the error:
[01:03:54,366][INFO ][importer.jdbc ][pool-2-thread-1] strategy standard: settings = {password=password, user=root, elasticsearch.cluster=elasticsearch, elasticsearch.host=localhost, index=products, max_bulk_actions=20000, treat_binary_as_string=true, elasticsearch.port=9200, sql=select * from products, url=jdbc:mysql://localhost:3306/products, max_concurrent_bulk_requests=10}, context = org.xbib.elasticsearch.jdbc.strategy.standard.StandardContext@7c697b82
[01:03:54,373][INFO ][importer.jdbc.context.standard][pool-2-thread-1] found sink class org.xbib.elasticsearch.jdbc.strategy.standard.StandardSink@229a439f
[01:03:54,400][INFO ][importer.jdbc.context.standard][pool-2-thread-1] found source class org.xbib.elasticsearch.jdbc.strategy.standard.StandardSource@2765a9e0
[01:03:54,561][INFO ][BaseTransportClient ][pool-2-thread-1] creating transport client, java version 1.8.0_05, effective settings {cluster.name=elasticsearch, host.0=localhost, port=9200, sniff=false, autodiscover=false, name=importer, client.transport.ignore_cluster_name=false, client.transport.ping_timeout=5s, client.transport.nodes_sampler_interval=5s}
[01:03:54,825][INFO ][org.elasticsearch.plugins][pool-2-thread-1] [importer] loaded [support-1.7.2.1-a140d30], sites []
[01:03:56,424][INFO ][BaseTransportClient ][pool-2-thread-1] trying to connect to [inet[localhost/127.0.0.1:9200]]
[01:03:56,501][ERROR][importer.jdbc.context.standard][pool-2-thread-1] ingest not properly build, shutting down ingest
org.elasticsearch.client.transport.NoNodeAvailableException: no cluster nodes available, check settings {cluster.name=elasticsearch, host.0=localhost, port=9200, sniff=false, autodiscover=false, name=importer, client.transport.ignore_cluster_name=false, client.transport.ping_timeout=5s, client.transport.nodes_sampler_interval=5s}
at org.xbib.elasticsearch.support.client.BaseTransportClient.createClient(BaseTransportClient.java:51) ~[elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.elasticsearch.support.client.BaseIngestTransportClient.newClient(BaseIngestTransportClient.java:22) ~[elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.elasticsearch.support.client.transport.BulkTransportClient.newClient(BulkTransportClient.java:89) ~[elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.elasticsearch.jdbc.strategy.standard.StandardContext$1.create(StandardContext.java:445) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.elasticsearch.jdbc.strategy.standard.StandardSink.beforeFetch(StandardSink.java:95) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.elasticsearch.jdbc.strategy.standard.StandardContext.beforeFetch(StandardContext.java:207) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.elasticsearch.jdbc.strategy.standard.StandardContext.execute(StandardContext.java:188) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.tools.JDBCImporter.process(JDBCImporter.java:118) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.tools.Importer.newRequest(Importer.java:241) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.tools.Importer.newRequest(Importer.java:57) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.pipeline.AbstractPipeline.call(AbstractPipeline.java:86) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.pipeline.AbstractPipeline.call(AbstractPipeline.java:17) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_05]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_05]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_05]
at java.lang.Thread.run(Thread.java:745) [?:1.8.0_05]
[01:03:56,631][WARN ][importer.jdbc.sink.standard][pool-2-thread-1] no ingest found
[01:03:57,409][ERROR][importer.jdbc.source.standard][pool-2-thread-1] while opening read connection: jdbc:mysql://localhost:3306/products Unknown database 'products'
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown database 'products'
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_05]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_05]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_05]
at java.lang.reflect.Constructor.newInstance(Constructor.java:408) ~[?:1.8.0_05]
at com.mysql.jdbc.Util.handleNewInstance(Util.java:404) ~[mysql-connector-java-5.1.37-bin.jar:5.1.37]
at com.mysql.jdbc.Util.getInstance(Util.java:387) ~[mysql-connector-java-5.1.37-bin.jar:5.1.37]
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:941) ~[mysql-connector-java-5.1.37-bin.jar:5.1.37]
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3870) ~[mysql-connector-java-5.1.37-bin.jar:5.1.37]
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3806) ~[mysql-connector-java-5.1.37-bin.jar:5.1.37]
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:871) ~[mysql-connector-java-5.1.37-bin.jar:5.1.37]
at com.mysql.jdbc.MysqlIO.proceedHandshakeWithPluggableAuthentication(MysqlIO.java:1686) ~[mysql-connector-java-5.1.37-bin.jar:5.1.37]
at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1207) ~[mysql-connector-java-5.1.37-bin.jar:5.1.37]
at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2254) ~[mysql-connector-java-5.1.37-bin.jar:5.1.37]
at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2285) ~[mysql-connector-java-5.1.37-bin.jar:5.1.37]
at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2084) ~[mysql-connector-java-5.1.37-bin.jar:5.1.37]
at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:795) ~[mysql-connector-java-5.1.37-bin.jar:5.1.37]
at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:44) ~[mysql-connector-java-5.1.37-bin.jar:5.1.37]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_05]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_05]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_05]
at java.lang.reflect.Constructor.newInstance(Constructor.java:408) ~[?:1.8.0_05]
at com.mysql.jdbc.Util.handleNewInstance(Util.java:404) ~[mysql-connector-java-5.1.37-bin.jar:5.1.37]
at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:400) ~[mysql-connector-java-5.1.37-bin.jar:5.1.37]
at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:327) ~[mysql-connector-java-5.1.37-bin.jar:5.1.37]
at java.sql.DriverManager.getConnection(DriverManager.java:664) ~[?:1.8.0_05]
at java.sql.DriverManager.getConnection(DriverManager.java:208) ~[?:1.8.0_05]
at org.xbib.elasticsearch.jdbc.strategy.standard.StandardSource.getConnectionForReading(StandardSource.java:493) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.elasticsearch.jdbc.strategy.standard.StandardSource.execute(StandardSource.java:678) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.elasticsearch.jdbc.strategy.standard.StandardSource.fetch(StandardSource.java:605) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.elasticsearch.jdbc.strategy.standard.StandardContext.fetch(StandardContext.java:215) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.elasticsearch.jdbc.strategy.standard.StandardContext.execute(StandardContext.java:190) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.tools.JDBCImporter.process(JDBCImporter.java:118) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.tools.Importer.newRequest(Importer.java:241) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.tools.Importer.newRequest(Importer.java:57) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.pipeline.AbstractPipeline.call(AbstractPipeline.java:86) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.pipeline.AbstractPipeline.call(AbstractPipeline.java:17) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_05]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_05]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_05]
at java.lang.Thread.run(Thread.java:745) [?:1.8.0_05]
[01:04:58,350][INFO ][importer.jdbc ][main] index name = products, concrete index name = products
[01:04:58,370][INFO ][importer.jdbc ][pool-2-thread-1] strategy standard: settings = {password=password, user=root, elasticsearch.cluster=elasticsearch, elasticsearch.host=localhost, index=products, max_bulk_actions=20000, treat_binary_as_string=true, elasticsearch.port=9200, sql=select * from products, url=jdbc:mysql://localhost:3306/elastic, max_concurrent_bulk_requests=10}, context = org.xbib.elasticsearch.jdbc.strategy.standard.StandardContext@7c697b82
[01:04:58,373][INFO ][importer.jdbc.context.standard][pool-2-thread-1] found sink class org.xbib.elasticsearch.jdbc.strategy.standard.StandardSink@229a439f
[01:04:58,381][INFO ][importer.jdbc.context.standard][pool-2-thread-1] found source class org.xbib.elasticsearch.jdbc.strategy.standard.StandardSource@2765a9e0
[01:04:58,431][INFO ][BaseTransportClient ][pool-2-thread-1] creating transport client, java version 1.8.0_05, effective settings {cluster.name=elasticsearch, host.0=localhost, port=9200, sniff=false, autodiscover=false, name=importer, client.transport.ignore_cluster_name=false, client.transport.ping_timeout=5s, client.transport.nodes_sampler_interval=5s}
[01:04:58,491][INFO ][org.elasticsearch.plugins][pool-2-thread-1] [importer] loaded [support-1.7.2.1-a140d30], sites []
[01:04:59,271][INFO ][BaseTransportClient ][pool-2-thread-1] trying to connect to [inet[localhost/127.0.0.1:9200]]
[01:04:59,305][ERROR][importer.jdbc.context.standard][pool-2-thread-1] ingest not properly build, shutting down ingest
org.elasticsearch.client.transport.NoNodeAvailableException: no cluster nodes available, check settings {cluster.name=elasticsearch, host.0=localhost, port=9200, sniff=false, autodiscover=false, name=importer, client.transport.ignore_cluster_name=false, client.transport.ping_timeout=5s, client.transport.nodes_sampler_interval=5s}
at org.xbib.elasticsearch.support.client.BaseTransportClient.createClient(BaseTransportClient.java:51) ~[elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.elasticsearch.support.client.BaseIngestTransportClient.newClient(BaseIngestTransportClient.java:22) ~[elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.elasticsearch.support.client.transport.BulkTransportClient.newClient(BulkTransportClient.java:89) ~[elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.elasticsearch.jdbc.strategy.standard.StandardContext$1.create(StandardContext.java:445) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.elasticsearch.jdbc.strategy.standard.StandardSink.beforeFetch(StandardSink.java:95) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.elasticsearch.jdbc.strategy.standard.StandardContext.beforeFetch(StandardContext.java:207) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.elasticsearch.jdbc.strategy.standard.StandardContext.execute(StandardContext.java:188) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.tools.JDBCImporter.process(JDBCImporter.java:118) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.tools.Importer.newRequest(Importer.java:241) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.tools.Importer.newRequest(Importer.java:57) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.pipeline.AbstractPipeline.call(AbstractPipeline.java:86) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.pipeline.AbstractPipeline.call(AbstractPipeline.java:17) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_05]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_05]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_05]
at java.lang.Thread.run(Thread.java:745) [?:1.8.0_05]
[01:04:59,345][WARN ][importer.jdbc.sink.standard][pool-2-thread-1] no ingest found
[01:04:59,928][WARN ][BulkTransportClient ][Thread-1] no client
[01:07:41,528][INFO ][importer.jdbc ][main] index name = products, concrete index name = products
[01:07:41,549][INFO ][importer.jdbc ][pool-2-thread-1] strategy standard: settings = {password=password, user=root, elasticsearch.cluster=elasticsearch, elasticsearch.host=localhost, index=products, max_bulk_actions=20000, treat_binary_as_string=true, elasticsearch.port=9200, sql=select * from products, url=jdbc:mysql://localhost:3306/elastic, max_concurrent_bulk_requests=10}, context = org.xbib.elasticsearch.jdbc.strategy.standard.StandardContext@23f42af5
[01:07:41,553][INFO ][importer.jdbc.context.standard][pool-2-thread-1] found sink class org.xbib.elasticsearch.jdbc.strategy.standard.StandardSink@3951dd63
[01:07:41,561][INFO ][importer.jdbc.context.standard][pool-2-thread-1] found source class org.xbib.elasticsearch.jdbc.strategy.standard.StandardSource@36dca21f
[01:07:41,612][INFO ][BaseTransportClient ][pool-2-thread-1] creating transport client, java version 1.8.0_05, effective settings {cluster.name=elasticsearch, host.0=localhost, port=9200, sniff=false, autodiscover=false, name=importer, client.transport.ignore_cluster_name=false, client.transport.ping_timeout=5s, client.transport.nodes_sampler_interval=5s}
[01:07:41,690][INFO ][org.elasticsearch.plugins][pool-2-thread-1] [importer] loaded [support-1.7.2.1-a140d30], sites []
[01:07:42,439][INFO ][BaseTransportClient ][pool-2-thread-1] trying to connect to [inet[localhost/127.0.0.1:9200]]
[01:07:47,489][INFO ][org.elasticsearch.client.transport][pool-2-thread-1] [importer] failed to get node info for [#transport#-1][apps][inet[localhost/127.0.0.1:9200]], disconnecting...
org.elasticsearch.transport.ReceiveTimeoutTransportException: [][inet[localhost/127.0.0.1:9200]][cluster:monitor/nodes/info] request_id [0] timed out after [5001ms]
at org.elasticsearch.transport.TransportService$TimeoutHandler.run(TransportService.java:529) ~[elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_05]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_05]
at java.lang.Thread.run(Thread.java:745) [?:1.8.0_05]
[01:07:47,509][ERROR][importer.jdbc.context.standard][pool-2-thread-1] ingest not properly build, shutting down ingest
org.elasticsearch.client.transport.NoNodeAvailableException: no cluster nodes available, check settings {cluster.name=elasticsearch, host.0=localhost, port=9200, sniff=false, autodiscover=false, name=importer, client.transport.ignore_cluster_name=false, client.transport.ping_timeout=5s, client.transport.nodes_sampler_interval=5s}
at org.xbib.elasticsearch.support.client.BaseTransportClient.createClient(BaseTransportClient.java:51) ~[elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.elasticsearch.support.client.BaseIngestTransportClient.newClient(BaseIngestTransportClient.java:22) ~[elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.elasticsearch.support.client.transport.BulkTransportClient.newClient(BulkTransportClient.java:89) ~[elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.elasticsearch.jdbc.strategy.standard.StandardContext$1.create(StandardContext.java:445) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.elasticsearch.jdbc.strategy.standard.StandardSink.beforeFetch(StandardSink.java:95) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.elasticsearch.jdbc.strategy.standard.StandardContext.beforeFetch(StandardContext.java:207) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.elasticsearch.jdbc.strategy.standard.StandardContext.execute(StandardContext.java:188) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.tools.JDBCImporter.process(JDBCImporter.java:118) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.tools.Importer.newRequest(Importer.java:241) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.tools.Importer.newRequest(Importer.java:57) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.pipeline.AbstractPipeline.call(AbstractPipeline.java:86) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at org.xbib.pipeline.AbstractPipeline.call(AbstractPipeline.java:17) [elasticsearch-jdbc-1.7.2.1-uberjar.jar:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_05]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_05]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_05]
at java.lang.Thread.run(Thread.java:745) [?:1.8.0_05]
[01:07:52,507][INFO ][org.elasticsearch.client.transport][elasticsearch[importer][generic][T#1]] [importer] failed to get node info for [#transport#-1][apps][inet[localhost/127.0.0.1:9200]], disconnecting...
org.elasticsearch.transport.ReceiveTimeoutTransportException: [][inet[localhost/127.0.0.1:9200]][cluster:monitor/nodes/info] request_id [1] timed out after [5000ms]
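
For reference, the settings line in the log corresponds to a JDBC importer definition roughly like the one below. This is a reconstruction from the logged settings only, not the actual script that was run; the classpath layout and the Runner/JDBCImporter launch line follow the usual elasticsearch-jdbc 1.7.x convention and are assumptions here.

#!/bin/sh
# Sketch reconstructed from the settings printed in the log above.
# Paths and the launch line are assumptions, not the reporter's script.
echo '{
    "type" : "jdbc",
    "jdbc" : {
        "url" : "jdbc:mysql://localhost:3306/products",
        "user" : "root",
        "password" : "password",
        "sql" : "select * from products",
        "index" : "products",
        "treat_binary_as_string" : true,
        "max_bulk_actions" : 20000,
        "max_concurrent_bulk_requests" : 10,
        "elasticsearch" : {
            "cluster" : "elasticsearch",
            "host" : "localhost",
            "port" : 9200
        }
    }
}' | java -cp "lib/*" org.xbib.tools.Runner org.xbib.tools.JDBCImporter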

@jprante
Owner

jprante commented Oct 25, 2015

There are numerous mistakes.

Why do you configure the HTTP port 9200? Use the transport port 9300 (see the sketch below).

The JDBC importer cannot find the cluster.

There is no database 'products' in MySQL (the log shows "Unknown database 'products'").
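
A minimal sketch of the corrected elasticsearch block in the importer definition, assuming a default installation where 9300 is the transport port the importer's TransportClient connects to and 9200 only serves the HTTP/REST API:

"elasticsearch" : {
    "cluster" : "elasticsearch",
    "host" : "localhost",
    "port" : 9300
}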

@mufukuri
Author

Thank you. I switched to port 9300 and it started working. Is 9200 only for HTTP access? The database does exist in MySQL and was never a problem.
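
A quick way to check the two ports on a default single-node setup (a sketch; the host and ports assume the defaults used in this thread, and the second check assumes netcat is installed): 9200 answers plain HTTP, while 9300 speaks the binary transport protocol used by the Java TransportClient and will not answer an HTTP request meaningfully.

# HTTP/REST API on 9200: should return the node banner JSON
curl http://localhost:9200

# Transport protocol on 9300: not HTTP, so only check that it is listening
nc -z localhost 9300 && echo "transport port 9300 is open"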
