
NoSuchMethodError occurs while trying to connect to Cloud Bigtable from a Cloud Dataflow worker #613

Closed
yosssi opened this issue Dec 25, 2015 · 5 comments

yosssi commented Dec 25, 2015

I set up a Cloud Dataflow pipeline following this article: https://cloud.google.com/bigtable/docs/dataflow-hbase

When I submitted it to the Cloud Dataflow managed service, I got the following error on a Cloud Dataflow worker:

Uncaught exception in main thread. Exiting with status code 1.
java.lang.NoSuchMethodError: io.grpc.netty.GrpcSslContexts.forClient()Lcom/google/bigtable/repackaged/io/netty/handler/ssl/SslContextBuilder;
at com.google.cloud.bigtable.grpc.BigtableSession.createSslContext(BigtableSession.java:98)
at com.google.cloud.bigtable.grpc.BigtableSession.access$000(BigtableSession.java:82)
at com.google.cloud.bigtable.grpc.BigtableSession$1.run(BigtableSession.java:151)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

How should I handle this problem?

Here is my Cloud Dataflow pipeline source code:

package mypackage;

import com.google.cloud.bigtable.dataflow.CloudBigtableIO;
import com.google.cloud.bigtable.dataflow.CloudBigtableOptions;
import com.google.cloud.bigtable.dataflow.CloudBigtableTableConfiguration;
import com.google.cloud.dataflow.sdk.Pipeline;
import com.google.cloud.dataflow.sdk.options.PipelineOptionsFactory;
import com.google.cloud.dataflow.sdk.transforms.Create;
import com.google.cloud.dataflow.sdk.transforms.DoFn;
import com.google.cloud.dataflow.sdk.transforms.ParDo;
import org.apache.hadoop.hbase.client.Mutation;
import org.apache.hadoop.hbase.client.Put;

public class Main {
    // Create a DoFn that creates a Put or Delete.  MUTATION_TRANSFORM is a simplistic example.
    static final DoFn<String, Mutation> MUTATION_TRANSFORM = new DoFn<String, Mutation>() {
        @Override
        public void processElement(DoFn<String, Mutation>.ProcessContext c) throws Exception {
            c.output(new Put(c.element().getBytes()).addColumn("v".getBytes(), "v".getBytes(), "value".getBytes()));
        }
    };

    public static void main(String[] args) {
        // CloudBigtableOptions is one way to retrieve the options.  It's not required to use this
        // specific PipelineOptions extension; CloudBigtableOptions is there as a convenience.
        CloudBigtableOptions options =
                PipelineOptionsFactory.fromArgs(args).withValidation().as(CloudBigtableOptions.class);

        // CloudBigtableTableConfiguration contains the project, zone, cluster and table to connect to
        CloudBigtableTableConfiguration config = CloudBigtableTableConfiguration.fromCBTOptions(options);

        Pipeline p = Pipeline.create(options);
        // This sets up serialization for Puts and Deletes so that Dataflow can potentially move them through
        // the network
        CloudBigtableIO.initializeForWrite(p);

        p
                .apply(Create.of("Hello", "World"))
                .apply(ParDo.of(MUTATION_TRANSFORM))
                .apply(CloudBigtableIO.writeToTable(config));

        p.run();
    }
}
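
For reference, a submission to the managed service would look something like the following. The Dataflow flags (--runner, --project, --stagingLocation) are standard for SDK 1.x, while the bigtable* option names and all the placeholder values (my-project, my-bucket, etc.) are assumptions based on the CloudBigtableOptions convention from the linked article:

mvn compile exec:java -Dexec.mainClass=mypackage.Main \
    -Dexec.args="--runner=BlockingDataflowPipelineRunner \
    --project=my-project --stagingLocation=gs://my-bucket/staging \
    --bigtableProjectId=my-project --bigtableClusterId=my-cluster \
    --bigtableZoneId=us-central1-b --bigtableTableId=my-table"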

sduskis commented Dec 25, 2015

This looks like a subtle dependency issue. Are you using Maven? If so, can you post your dependencies?
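
One quick way to see which io.grpc artifacts end up on the classpath is Maven's dependency tree, filtered to that group (standard maven-dependency-plugin usage; the output will depend on your project):

mvn dependency:tree -Dincludes=io.grpc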


yosssi commented Dec 26, 2015

@sduskis Thank you for your reply. Yes, I'm using Maven. Here are my dependencies:

<dependencies>
  <dependency>
    <groupId>com.google.cloud.dataflow</groupId>
    <artifactId>google-cloud-dataflow-java-sdk-all</artifactId>
    <version>1.3.0</version>
  </dependency>
  <dependency>
    <groupId>com.google.cloud.bigtable</groupId>
    <artifactId>bigtable-hbase-dataflow</artifactId>
    <version>0.2.2</version>
  </dependency>
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>3.8.1</version>
    <scope>test</scope>
  </dependency>
</dependencies>


sduskis commented Dec 28, 2015

This problem is the same as #613.

I think the problem here is that both Dataflow and Bigtable include io.grpc. Bigtable uses the Maven shade plugin to rename (relocate) its bundled dependencies, but didn't relocate the io.grpc package names, as described in #582.
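
For illustration, a relocation entry in the maven-shade-plugin configuration is what rewrites those package names. A minimal sketch of the mechanism (not the project's actual build file), using the com.google.bigtable.repackaged prefix that appears in the stack trace above:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <relocations>
      <!-- Rewrite io.grpc.* class references into the repackaged
           namespace so they cannot clash with the io.grpc that the
           Dataflow SDK brings onto the worker's classpath. -->
      <relocation>
        <pattern>io.grpc</pattern>
        <shadedPattern>com.google.bigtable.repackaged.io.grpc</shadedPattern>
      </relocation>
    </relocations>
  </configuration>
</plugin>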

Your best bet to get around the problem is to use the 0.2.3-SNAPSHOT version of bigtable-hbase. You'll have to add the following to your pom.xml to be able to use SNAPSHOTs:

  <repositories>
    <repository>
      <id>snapshots-repo</id>
      <url>https://oss.sonatype.org/content/repositories/snapshots</url>
      <releases><enabled>false</enabled></releases>
      <snapshots><enabled>true</enabled></snapshots>
    </repository>
  </repositories>
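
Then point the existing dependency at the snapshot (same groupId and artifactId as in your pom above; only the version changes):

  <dependency>
    <groupId>com.google.cloud.bigtable</groupId>
    <artifactId>bigtable-hbase-dataflow</artifactId>
    <version>0.2.3-SNAPSHOT</version>
  </dependency>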

We'll release an official version ASAP in the new year.


yosssi commented Dec 28, 2015

@sduskis Thanks a lot! I was able to write records to Cloud Bigtable successfully using the 0.2.3-SNAPSHOT version of bigtable-hbase-dataflow.

I'm looking forward to the official release.

sduskis closed this as completed Dec 29, 2015