Closed
Labels
api: monitoring (Issues related to the Cloud Monitoring API.)
type: question (Request for information or clarification. Not an issue.)
Description
When trying to instantiate a MetricServiceClient in order to report metrics to Google Stackdriver Monitoring, using the following code on a Dataproc cluster (image version 1.0):

MetricServiceSettings settings = MetricServiceSettings.defaultBuilder()
    .setCredentialsProvider(credentialsProvider)
    .build();
MetricServiceClient metricServiceClient = MetricServiceClient.create(settings);

I get the following stack trace:
Exception in thread "main" java.lang.NoClassDefFoundError: org/spark-project/jetty/alpn/ALPN$Provider
at io.netty.handler.ssl.JdkAlpnApplicationProtocolNegotiator$1.<init>(JdkAlpnApplicationProtocolNegotiator.java:26)
at io.netty.handler.ssl.JdkAlpnApplicationProtocolNegotiator.<clinit>(JdkAlpnApplicationProtocolNegotiator.java:24)
at io.netty.handler.ssl.JdkSslContext.toNegotiator(JdkSslContext.java:237)
at io.netty.handler.ssl.JdkSslClientContext.<init>(JdkSslClientContext.java:189)
at io.netty.handler.ssl.SslContext.newClientContextInternal(SslContext.java:729)
at io.netty.handler.ssl.SslContextBuilder.build(SslContextBuilder.java:223)
at io.grpc.netty.NettyChannelBuilder$NettyTransportFactory.<init>(NettyChannelBuilder.java:470)
at io.grpc.netty.NettyChannelBuilder.buildTransportFactory(NettyChannelBuilder.java:338)
at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:305)
at com.google.api.gax.grpc.InstantiatingChannelProvider.createChannel(InstantiatingChannelProvider.java:125)
at com.google.api.gax.grpc.InstantiatingChannelProvider.getChannel(InstantiatingChannelProvider.java:110)
at com.google.api.gax.grpc.GrpcTransportProvider.getTransport(GrpcTransportProvider.java:98)
at com.google.api.gax.grpc.GrpcTransportProvider.getTransport(GrpcTransportProvider.java:59)
at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:97)
at com.google.cloud.monitoring.v3.stub.GrpcMetricServiceStub.create(GrpcMetricServiceStub.java:161)
at com.google.cloud.monitoring.v3.MetricServiceSettings.createStub(MetricServiceSettings.java:195)
at com.google.cloud.monitoring.v3.MetricServiceClient.<init>(MetricServiceClient.java:140)
at com.google.cloud.monitoring.v3.MetricServiceClient.create(MetricServiceClient.java:122)
These are the relevant dependencies in my pom.xml (Spark version 1.6.3):
<dependency>
  <groupId>com.google.cloud</groupId>
  <artifactId>google-cloud-monitoring</artifactId>
  <version>0.22.0-alpha</version>
  <exclusions>
    <exclusion>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.10</artifactId>
  <version>${spark.version}</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.apache.commons</groupId>
      <artifactId>commons-lang3</artifactId>
    </exclusion>
    <exclusion>
      <groupId>net.java.dev.jets3t</groupId>
      <artifactId>jets3t</artifactId>
    </exclusion>
    <exclusion>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
    </exclusion>
  </exclusions>
  <!--<scope>provided</scope>-->
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-catalyst_2.10</artifactId>
  <version>${spark.version}</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.apache.commons</groupId>
      <artifactId>commons-lang3</artifactId>
    </exclusion>
    <exclusion>
      <groupId>net.java.dev.jets3t</groupId>
      <artifactId>jets3t</artifactId>
    </exclusion>
    <exclusion>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
    </exclusion>
  </exclusions>
  <!--<scope>provided</scope>-->
</dependency>
<dependency>
  <groupId>com.databricks</groupId>
  <artifactId>spark-csv_2.11</artifactId>
  <version>${spark.csv.version}</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.apache.commons</groupId>
      <artifactId>commons-lang3</artifactId>
    </exclusion>
    <exclusion>
      <groupId>net.java.dev.jets3t</groupId>
      <artifactId>jets3t</artifactId>
    </exclusion>
    <exclusion>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
    </exclusion>
  </exclusions>
</dependency>
Any advice as to how to resolve this issue would be appreciated. Thanks.
Edit: I have no issue when running my program locally; the problem only appears when running on the Dataproc cluster.
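
One workaround I'm considering (untested; it assumes the root cause is that Spark's repackaged org.spark-project.jetty ALPN classes on the cluster classpath shadow the ALPN provider that gRPC's Netty expects) is relocating the conflicting packages when building the job jar with the maven-shade-plugin. The plugin version and the "repackaged" prefix below are placeholders, not something I've verified:

```xml
<!-- Sketch of a possible workaround: relocate packages that clash with what
     Spark/Dataproc already puts on the classpath. Untested; the plugin
     version and the "repackaged" prefix are placeholders. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <!-- Keep gRPC's Netty separate from anything Spark provides -->
          <relocation>
            <pattern>io.netty</pattern>
            <shadedPattern>repackaged.io.netty</shadedPattern>
          </relocation>
          <!-- Relocating Guava instead of excluding it might also avoid
               the exclusions above -->
          <relocation>
            <pattern>com.google.common</pattern>
            <shadedPattern>repackaged.com.google.common</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

If anyone can confirm whether relocation is the recommended approach here, or whether the ALPN provider needs to be supplied some other way, that would be helpful.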