
Allow users of kafka-avro-serializer to use any Scala version. #61

Merged 1 commit into master on Jan 29, 2015

Conversation

ewencp (Contributor) commented Jan 27, 2015

Make Kafka a provided dependency for kafka-avro-serializer. The serializer still
needs the core jar, which depends on Scala, but doesn't use any Scala code.
Instead of building multiple jars for different Scala versions, we can require
users to provide the dependency themselves, which they will already be doing in
order to use those interfaces. Additionally, the serializer works without that
jar if users only use interfaces from kafka-clients, i.e. the new
producer/consumer interfaces. Fixes #60.

ewencp (Contributor, Author) commented Jan 27, 2015

Fixing #60 turns out to be trivial, since one of Maven's scopes provides exactly the behavior we want. To verify, I applied the patch, installed it locally, then used the following two files to check that the build failed as expected when commenting/uncommenting the kafka dependency and the corresponding sections of the test program.

<!-- pom.xml -->
<project xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xmlns="http://maven.apache.org/POM/4.0.0"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
         http://maven.apache.org/maven-v4_0_0.xsd">

    <modelVersion>4.0.0</modelVersion>

    <groupId>io.confluent</groupId>
    <artifactId>testing</artifactId>
    <packaging>jar</packaging>
    <name>testing</name>
    <version>0.1-SNAPSHOT</version>

    <dependencies>
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
            <version>0.8.2.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka_2.11</artifactId>
            <version>0.8.2.0</version>
        </dependency>
        <dependency>
            <groupId>io.confluent</groupId>
            <artifactId>kafka-avro-serializer</artifactId>
            <version>0.1-SNAPSHOT</version>
        </dependency>
    </dependencies>
</project>
// src/main/java/io/confluent/testing/Test.java
package io.confluent.testing;

import java.util.Properties;

import kafka.consumer.Consumer;
import kafka.consumer.ConsumerConfig;
import kafka.javaapi.consumer.ConsumerConnector;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

public class Test {
    public static void main(String[] args) {
        Properties cprops = new Properties();
        cprops.put("zookeeper.connect", "localhost:2181");
        cprops.put("group.id", "foobar");
        cprops.put("consumer.id", "baz");
        ConsumerConnector consumer;
        System.out.println("Creating connector");
        consumer = Consumer.createJavaConsumerConnector(new ConsumerConfig(cprops));
        consumer.shutdown();

        try {
            Properties props = new Properties();
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, "io.confluent.kafka.serializers.KafkaAvroSerializer");
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, "io.confluent.kafka.serializers.KafkaAvroSerializer");
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            System.out.println("Creating producer");
            KafkaProducer<Object, Object> producer = new KafkaProducer<Object, Object>(props);
            producer.send(new ProducerRecord<Object, Object>("foo", "bar")).get();
            producer.close();
        } catch (Exception e) {
            e.printStackTrace(); // don't swallow failures silently
        }
    }
}

@@ -20,6 +20,7 @@
         <groupId>org.apache.kafka</groupId>
         <artifactId>kafka_${kafka.scala.version}</artifactId>
         <version>${kafka.version}</version>
+        <scope>provided</scope>
Contributor:

So provided ensures that it can be used at compile time but not for packaging?

ewencp (Contributor, Author):

Yeah, see the description here: http://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#Dependency_Scope -- basically the dependency is available while compiling the package, but anything that runs or links against the application/library is expected to provide the dependency itself.
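As a minimal sketch of what this looks like in practice (coordinates copied from the examples above; the Scala suffix is whichever version the downstream user picks): the library's own pom marks the core jar provided, so it is on the compile classpath but is not pulled in transitively, and the consuming project declares its preferred Scala build itself:

```xml
<!-- In kafka-avro-serializer's pom: visible at compile time only,
     excluded from transitive resolution and packaging -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_${kafka.scala.version}</artifactId>
    <version>${kafka.version}</version>
    <scope>provided</scope>
</dependency>

<!-- In the downstream project's pom: the user supplies whichever
     Scala build they want (kafka_2.10, kafka_2.11, ...) -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.11</artifactId>
    <version>0.8.2.0</version>
</dependency>
```

If the downstream project only uses the new producer/consumer interfaces from kafka-clients, it can omit the core jar entirely, since provided dependencies are never required transitively.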

Contributor:

Thanks! Got it.

nehanarkhede (Contributor):

+1

ewencp added a commit that referenced this pull request Jan 29, 2015
Allow users of kafka-avro-serializer to use any Scala version.
@ewencp ewencp merged commit 95f5bb3 into master Jan 29, 2015
@ewencp ewencp deleted the kafka-dependency-provided branch January 29, 2015 06:03
rayokota added a commit that referenced this pull request Sep 24, 2024
Successfully merging this pull request may close these issues:

Skip Kafka while building the avro-serializer package (#60)