User-Agent: Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; InfoPath.3; .NET4.0C; .NET4.0E; MS-RTC LM 8)
Build Identifier:
We have problems connecting to a MonetDB cluster from Apache Spark via JDBC. Connecting to a database that is not clustered works, but connecting to a clustered MonetDB database from Spark fails with an "unhandled result type" error. The failing query is given below; see also http://stackoverflow.com/questions/29747500/apache-spark-fails-to-connect-to-monetdb-cluster-using-jdbc-driver
PREPARE SELECT * FROM (select count(*) from customer)v1 WHERE 1=0
Reproducible: Always
Steps to Reproduce:
1. Set up a funnel or a MonetDB cluster.
2. Issue the statement via JDBC: PREPARE SELECT * FROM (select count(*) from customer)v1 WHERE 1=0
3. The software gives an "unhandled result type" error.
Actual Results:
The software gives an "unhandled result type" error.
val v1 = hiveContext.load("jdbc",Map("url" -> "jdbc:monetdb://1.1.1.1/tpch1?user=monetdb&password=monetdb","dbtable" -> "(select count(*) from customer)v1"))
java.sql.SQLException: node */tpch/1/monet returned unhandled result type
java.sql.SQLException: node */tpch/2/monet returned unhandled result type
at nl.cwi.monetdb.jdbc.MonetConnection$ResponseList.executeQuery(MonetConnection.java:2536)
at nl.cwi.monetdb.jdbc.MonetConnection$ResponseList.processQuery(MonetConnection.java:2284)
at nl.cwi.monetdb.jdbc.MonetStatement.internalExecute(MonetStatement.java:508)
at nl.cwi.monetdb.jdbc.MonetStatement.execute(MonetStatement.java:349)
at nl.cwi.monetdb.jdbc.MonetPreparedStatement.<init>(MonetPreparedStatement.java:118)
at nl.cwi.monetdb.jdbc.MonetConnection.prepareStatement(MonetConnection.java:901)
at nl.cwi.monetdb.jdbc.MonetConnection.prepareStatement(MonetConnection.java:825)
at org.apache.spark.sql.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:96)
at org.apache.spark.sql.jdbc.JDBCRelation.<init>(JDBCRelation.scala:125)
at org.apache.spark.sql.jdbc.DefaultSource.createRelation(JDBCRelation.scala:114)
at org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:290)
at org.apache.spark.sql.SQLContext.load(SQLContext.scala:679)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:23)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:28)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:30)
at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:32)
at $iwC$$iwC$$iwC$$iwC.<init>(<console>:34)
at $iwC$$iwC$$iwC.<init>(<console>:36)
at $iwC$$iwC.<init>(<console>:38)
at $iwC.<init>(<console>:40)
at <init>(<console>:42)
at .<init>(<console>:46)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
JDBC Log
RD 1429856293844: read final block: 67 bytes
RX 1429856293845: bwnthqobupI3CPY:merovingian:9:RIPEMD160,SHA256,SHA1,MD5:LIT:SHA512:
RD 1429856293845: inserting prompt
TD 1429856293846: write final block: 99 bytes
TX 1429856293846: BIG:merovingian:{SHA256}7070477fd396595c929386453a32ac5059a9b76976e54dab3fbb7a8f5299bd25:sql:tpch1:
RD 1429856293847: read final block: 0 bytes
RX 1429856293847:
RD 1429856293847: inserting prompt
TD 1429856293847: write final block: 49 bytes
TX 1429856293847: sSET TIME ZONE INTERVAL '+05:30' HOUR TO MINUTE ;
RD 1429856293848: read final block: 3 bytes
RX 1429856293848: &3
RD 1429856293848: inserting prompt
TD 1429856293855: write final block: 15 bytes
TX 1429856293855: Xreply_size 250
RD 1429856293855: read final block: 0 bytes
RX 1429856293855:
RD 1429856293855: inserting prompt
TD 1429856293855: write final block: 68 bytes
TX 1429856293855: sPREPARE SELECT * FROM (select count(*) from customer)v1 WHERE 1=0 ;
RD 1429856293856: read final block: 52 bytes
RX 1429856293856: !node */tpch/2/monet returned unhandled result type
RD 1429856293856: inserting prompt
This is not a bug in monetdb-java but in MonetDB itself. The error message is generated in tools/merovingian/daemon/multiplex-funnel.c, line 656, in the default case of a switch statement that handles the cases Q_PARSE, Q_TABLE, Q_UPDATE, Q_SCHEMA and Q_TRANS, but not Q_PREPARE or Q_BLOCK. Those probably didn't exist yet when the funnel code was written.
The failing query was a PREPARE statement, so it is presumably failing on Q_PREPARE.
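A minimal C sketch of the dispatch gap described above. The enum tags mirror the names quoted in the comment, but the definitions and the exact switch body here are illustrative, not the real multiplex-funnel.c code: any tag the switch does not name falls into the default case and produces the "unhandled result type" error seen in the report.

```c
#include <stdio.h>

/* Hypothetical result-type tags, named after the cases the comment above
 * says the funnel's switch handles (and the two it does not). The real
 * definitions live in the MonetDB source tree. */
typedef enum {
    Q_PARSE, Q_TABLE, Q_UPDATE, Q_SCHEMA, Q_TRANS,
    Q_PREPARE, Q_BLOCK
} qtype;

/* Sketch of the dispatch: tags without an explicit case fall through to
 * the default branch, which is where the reported error originates. */
const char *funnel_dispatch(qtype t)
{
    switch (t) {
    case Q_PARSE:
    case Q_TABLE:
    case Q_UPDATE:
    case Q_SCHEMA:
    case Q_TRANS:
        return "handled";
    default: /* Q_PREPARE and Q_BLOCK end up here */
        return "node returned unhandled result type";
    }
}
```

Under this sketch, a PREPARE statement maps to Q_PREPARE and hits the default branch, which matches the SQLException text in the log; the fix would be to add explicit cases for Q_PREPARE and Q_BLOCK.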
Date: 2015-04-24 13:33:35 +0200
From: karthik <karthik_sitaram>
To: clients devs <>
Version: 11.21.19 (Jul2015-SP4)
Last updated: 2016-04-11 11:18:37 +0200