
Unable to load the library 'com_google_cloud_spark_bigquery_repackaged_netty_tcnative_windows_x86_64' #326

Closed
AlixMetivier-zz opened this issue Mar 1, 2021 · 2 comments
@AlixMetivier-zz

Hello, I'm trying to use the connector to read a table's contents, but it fails with the error below. I've tried versions 0.19.0, 0.18.1, and 0.17.3 and get the same error each time.
My code:

ss.read().format("bigquery")
        .option("project", PROJECTID)
        .option("parentProject", PROJECTID)
        .option("credentialsFile", PATHTOCREDENTIAL)
        .option("temporaryGcsBucket", TEMPBUCKET)
        .option("checkpointLocation", EU)
        .load(PROJECTID + "." + DATASETID + "." + TABLE)
        .show();
java.lang.UnsatisfiedLinkError: no com_google_cloud_spark_bigquery_repackaged_netty_tcnative_windows_x86_64 in java.library.path
	at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1860) ~[?:1.8.0_251]
	at java.lang.Runtime.loadLibrary0(Runtime.java:870) ~[?:1.8.0_251]
	at java.lang.System.loadLibrary(System.java:1122) ~[?:1.8.0_251]
	at com.google.cloud.spark.bigquery.repackaged.io.netty.util.internal.NativeLibraryUtil.loadLibrary(NativeLibraryUtil.java:38) ~[spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_251]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_251]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_251]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_251]
	at com.google.cloud.spark.bigquery.repackaged.io.netty.util.internal.NativeLibraryLoader$1.run(NativeLibraryLoader.java:371) ~[spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_251]
	at com.google.cloud.spark.bigquery.repackaged.io.netty.util.internal.NativeLibraryLoader.loadLibraryByHelper(NativeLibraryLoader.java:363) ~[spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at com.google.cloud.spark.bigquery.repackaged.io.netty.util.internal.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:341) [spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at com.google.cloud.spark.bigquery.repackaged.io.netty.util.internal.NativeLibraryLoader.load(NativeLibraryLoader.java:136) [spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at com.google.cloud.spark.bigquery.repackaged.io.netty.util.internal.NativeLibraryLoader.loadFirstAvailable(NativeLibraryLoader.java:96) [spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at com.google.cloud.spark.bigquery.repackaged.io.netty.handler.ssl.OpenSsl.loadTcNative(OpenSsl.java:581) [spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at com.google.cloud.spark.bigquery.repackaged.io.netty.handler.ssl.OpenSsl.<clinit>(OpenSsl.java:133) [spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at com.google.cloud.spark.bigquery.repackaged.io.grpc.netty.GrpcSslContexts.defaultSslProvider(GrpcSslContexts.java:217) [spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at com.google.cloud.spark.bigquery.repackaged.io.grpc.netty.GrpcSslContexts.configure(GrpcSslContexts.java:144) [spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at com.google.cloud.spark.bigquery.repackaged.io.grpc.netty.GrpcSslContexts.forClient(GrpcSslContexts.java:93) [spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at com.google.cloud.spark.bigquery.repackaged.io.grpc.netty.NettyChannelBuilder.buildTransportFactory(NettyChannelBuilder.java:414) [spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at com.google.cloud.spark.bigquery.repackaged.io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:513) [spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at com.google.cloud.spark.bigquery.repackaged.com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:314) [spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at com.google.cloud.spark.bigquery.repackaged.com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1600(InstantiatingGrpcChannelProvider.java:71) [spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at com.google.cloud.spark.bigquery.repackaged.com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:210) [spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at com.google.cloud.spark.bigquery.repackaged.com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72) [spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at com.google.cloud.spark.bigquery.repackaged.com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:217) [spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at com.google.cloud.spark.bigquery.repackaged.com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:200) [spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at com.google.cloud.spark.bigquery.repackaged.com.google.api.gax.rpc.ClientContext.create(ClientContext.java:166) [spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.bigquery.storage.v1.stub.EnhancedBigQueryReadStub.create(EnhancedBigQueryReadStub.java:89) [spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.bigquery.storage.v1.BigQueryReadClient.<init>(BigQueryReadClient.java:129) [spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.bigquery.storage.v1.BigQueryReadClient.create(BigQueryReadClient.java:110) [spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at com.google.cloud.spark.bigquery.direct.DirectBigQueryRelation$.createReadClient(DirectBigQueryRelation.scala:354) [spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at com.google.cloud.spark.bigquery.direct.DirectBigQueryRelation$$anonfun$$lessinit$greater$default$3$1.apply(DirectBigQueryRelation.scala:44) [spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at com.google.cloud.spark.bigquery.direct.DirectBigQueryRelation$$anonfun$$lessinit$greater$default$3$1.apply(DirectBigQueryRelation.scala:44) [spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at com.google.cloud.spark.bigquery.direct.DirectBigQueryRelation.buildScan(DirectBigQueryRelation.scala:127) [spark-bigquery-with-dependencies_2.11-0.17.3.jar:0.17.3]
	at org.apache.spark.sql.execution.datasources.DataSourceStrategy$$anonfun$10.apply(DataSourceStrategy.scala:293) [spark-sql_2.11-2.4.0.jar:2.4.0]
	at org.apache.spark.sql.execution.datasources.DataSourceStrategy$$anonfun$10.apply(DataSourceStrategy.scala:293) [spark-sql_2.11-2.4.0.jar:2.4.0]
	at org.apache.spark.sql.execution.datasources.DataSourceStrategy$$anonfun$pruneFilterProject$1.apply(DataSourceStrategy.scala:326) [spark-sql_2.11-2.4.0.jar:2.4.0]
	at org.apache.spark.sql.execution.datasources.DataSourceStrategy$$anonfun$pruneFilterProject$1.apply(DataSourceStrategy.scala:325) [spark-sql_2.11-2.4.0.jar:2.4.0]
	at org.apache.spark.sql.execution.datasources.DataSourceStrategy.pruneFilterProjectRaw(DataSourceStrategy.scala:403) [spark-sql_2.11-2.4.0.jar:2.4.0]
	at org.apache.spark.sql.execution.datasources.DataSourceStrategy.pruneFilterProject(DataSourceStrategy.scala:321) [spark-sql_2.11-2.4.0.jar:2.4.0]
	at org.apache.spark.sql.execution.datasources.DataSourceStrategy.apply(DataSourceStrategy.scala:289) [spark-sql_2.11-2.4.0.jar:2.4.0]
	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:63) [spark-catalyst_2.11-2.4.0.jar:2.4.0]
	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:63) [spark-catalyst_2.11-2.4.0.jar:2.4.0]
	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434) [scala-library-2.11.8.jar:?]
	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440) [scala-library-2.11.8.jar:?]
	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:439) [scala-library-2.11.8.jar:?]
	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93) [spark-catalyst_2.11-2.4.0.jar:2.4.0]
	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:78) [spark-catalyst_2.11-2.4.0.jar:2.4.0]
	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:75) [spark-catalyst_2.11-2.4.0.jar:2.4.0]
	at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157) [scala-library-2.11.8.jar:?]
	at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157) [scala-library-2.11.8.jar:?]
	at scala.collection.Iterator$class.foreach(Iterator.scala:893) [scala-library-2.11.8.jar:?]
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336) [scala-library-2.11.8.jar:?]
	at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157) [scala-library-2.11.8.jar:?]
	at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1336) [scala-library-2.11.8.jar:?]
	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:75) [spark-catalyst_2.11-2.4.0.jar:2.4.0]
	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:67) [spark-catalyst_2.11-2.4.0.jar:2.4.0]
	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434) [scala-library-2.11.8.jar:?]
	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440) [scala-library-2.11.8.jar:?]
	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93) [spark-catalyst_2.11-2.4.0.jar:2.4.0]
	at org.apache.spark.sql.execution.QueryExecution.sparkPlan$lzycompute(QueryExecution.scala:72) [spark-sql_2.11-2.4.0.jar:2.4.0]
	at org.apache.spark.sql.execution.QueryExecution.sparkPlan(QueryExecution.scala:68) [spark-sql_2.11-2.4.0.jar:2.4.0]
	at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:77) [spark-sql_2.11-2.4.0.jar:2.4.0]
	at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:77) [spark-sql_2.11-2.4.0.jar:2.4.0]
	at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3360) [spark-sql_2.11-2.4.0.jar:2.4.0]
	at org.apache.spark.sql.Dataset.head(Dataset.scala:2545) [spark-sql_2.11-2.4.0.jar:2.4.0]
	at org.apache.spark.sql.Dataset.take(Dataset.scala:2759) [spark-sql_2.11-2.4.0.jar:2.4.0]
	at org.apache.spark.sql.Dataset.getRows(Dataset.scala:255) [spark-sql_2.11-2.4.0.jar:2.4.0]
	at org.apache.spark.sql.Dataset.showString(Dataset.scala:292) [spark-sql_2.11-2.4.0.jar:2.4.0]
	at org.apache.spark.sql.Dataset.show(Dataset.scala:746) [spark-sql_2.11-2.4.0.jar:2.4.0]
	at org.apache.spark.sql.Dataset.show(Dataset.scala:705) [spark-sql_2.11-2.4.0.jar:2.4.0]
	at org.apache.spark.sql.Dataset.show(Dataset.scala:714) [spark-sql_2.11-2.4.0.jar:2.4.0]
@AlixMetivier-zz
Author

It seems that netty_tcnative_windows_x86_64.dll is present in META-INF/native, but it was not renamed to libcom_google_cloud_spark_bigquery_repackaged_netty_tcnative_windows_x86_64.dll.
I tried renaming it myself, but I don't think that's possible because the resulting file name is too long.
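For context, here is a minimal sketch of where the long name in the error comes from. This is an assumption about the mechanism, not the connector's actual code: when Netty is shaded ("repackaged") into the connector jar, its native-library loader prepends the relocated package prefix, with dots replaced by underscores, to the base tcnative name, and the JVM then searches java.library.path for a file with that mangled name (plus the platform suffix via System.mapLibraryName). The helper shadedName below is hypothetical and only mirrors that naming rule:

```java
public class ShadedLibName {
    // Hypothetical mirror of the shaded-Netty naming rule: the relocated
    // package prefix, dots replaced by underscores, is prepended to the
    // base native library name.
    static String shadedName(String packagePrefix, String baseName) {
        return packagePrefix.replace('.', '_') + baseName;
    }

    public static void main(String[] args) {
        String name = shadedName("com.google.cloud.spark.bigquery.repackaged.",
                                 "netty_tcnative_windows_x86_64");
        // Reproduces the library name from the UnsatisfiedLinkError above.
        System.out.println(name);
        // The JVM maps this to a platform-specific file name (e.g. a .dll
        // on Windows, lib<name>.so on Linux) when searching java.library.path.
        System.out.println(System.mapLibraryName(name));
    }
}
```

Because the file bundled in META-INF/native keeps only the short base name, the lookup under the mangled name fails unless the loader's fallback (extracting and renaming the bundled copy) succeeds, which is consistent with the rename problem described above.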

@davidrabinowitz
Member

Fixed in version 0.20.0.
