fix: address code review comments.
dianakhuang committed Aug 18, 2022
1 parent e7e19fc commit b50e795
Showing 4 changed files with 18 additions and 11 deletions.
2 changes: 1 addition & 1 deletion docs/decisions/0005-optional-import-of-confluent-kafka.rst
@@ -11,7 +11,7 @@ Context
 
 * `confluent-kafka`_ is a library written and maintained by Confluent, our managed instance provider (see :doc:`0004-kafka-managed-hosting`). The library abstracts out the work for sending and receiving events to and from the Kafka cluster and converting them into message objects.
 * confluent-kafka in turn is a wrapper around a C library called `librdkafka`_ (distributed as `librdkafka_dev`)
-* librdkafka-dev does not currently have a compiled binary available for Linux/aarch64
+* librdkafka-dev does not currently have a compiled binary available for Linux/aarch64, which causes issues for developers who are running Tutor on M1 Macs.
 
 As a result of the points above, if a package includes a dependency on confluent-kafka, developers will not be able to install the package in Linux/aarch64 environments.

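The decision above is implemented with an optional-import guard: the package imports confluent-kafka when it can, and otherwise records its absence (the `confluent_kafka = None` fallback is visible in the config.py hunk below). A minimal sketch of that pattern; the `kafka_available` helper is illustrative, not part of the package:

```python
# Optional-import guard: try to import the optional dependency, and fall
# back to None when no librdkafka binary exists for this platform
# (e.g. Linux/aarch64). Callers must check for None before use.
try:
    import confluent_kafka
except ImportError:
    confluent_kafka = None


def kafka_available():
    """Report whether event-bus features backed by confluent-kafka can run."""
    return confluent_kafka is not None
```

This keeps the package importable everywhere; only the Kafka-backed code paths degrade when the dependency is missing.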
5 changes: 4 additions & 1 deletion edx_event_bus_kafka/config.py
@@ -14,10 +14,13 @@
 confluent_kafka = None
 
 
-# return type (Optional[SchemaRegistryClient]) removed for better error messaging when confluent-kafka is not available
 def create_schema_registry_client():
     """
     Create a schema registry client from common settings.
+
+    Returns
+        None if confluent_kafka library is not available or the settings are invalid.
+        SchemaRegistryClient if it is.
     """
     if not confluent_kafka:
         warnings.warn('Library confluent-kafka not available. Cannot create schema registry client.')
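The new docstring makes the degraded mode explicit: when confluent-kafka is missing, the factory warns and returns None instead of raising, so callers must handle both outcomes. A standalone sketch of that behavior, with the `confluent_kafka` stub and the returned object as simplified stand-ins for the real import and `SchemaRegistryClient` construction:

```python
import warnings

confluent_kafka = None  # stand-in: simulate confluent-kafka failing to import


def create_schema_registry_client():
    """
    Create a schema registry client from common settings.

    Returns
        None if confluent_kafka library is not available.
        A client object if it is.
    """
    if not confluent_kafka:
        warnings.warn('Library confluent-kafka not available. Cannot create schema registry client.')
        return None
    return object()  # real code would build a SchemaRegistryClient from settings


client = create_schema_registry_client()
if client is None:
    # Callers see the warning and get None; they must skip registry setup.
    print("schema registry unavailable; skipping setup")
```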
18 changes: 10 additions & 8 deletions edx_event_bus_kafka/consumer/event_consumer.py
@@ -49,19 +49,21 @@ class KafkaEventConsumer:
     """
 
     def __init__(self, topic, group_id, signal):
-        if confluent_kafka:
-            self.topic = topic
-            self.group_id = group_id
-            self.signal = signal
-            self.consumer = self._create_consumer()
-        else:
+        if confluent_kafka is None:
             raise Exception('Library confluent-kafka not available. Cannot create event consumer.')
 
-    # return type (Optional[DeserializingConsumer]) removed for better error messaging when confluent-kafka is not
-    # available
+        self.topic = topic
+        self.group_id = group_id
+        self.signal = signal
+        self.consumer = self._create_consumer()
+
     def _create_consumer(self):
         """
         Create a DeserializingConsumer for events of the given signal instance.
+
+        Returns
+            None if confluent_kafka is not available.
+            DeserializingConsumer if it is.
         """
 
         schema_registry_client = create_schema_registry_client()
4 changes: 3 additions & 1 deletion edx_event_bus_kafka/publishing/event_producer.py
@@ -124,7 +124,6 @@ def get_serializer(signal: OpenEdxPublicSignal) -> AvroSignalSerializer:
 # outbound-message queue and threads. The use of this cache allows the
 # producers to be long-lived.
 
-# return type (Optional[SerializingProducer]) removed for better error messaging when confluent-kafka is not available
 @lru_cache
 def get_producer_for_signal(signal: OpenEdxPublicSignal, event_key_field: str):
     """
@@ -136,6 +135,9 @@ def get_producer_for_signal(signal: OpenEdxPublicSignal, event_key_field: str):
         signal: The OpenEdxPublicSignal to make a producer for
         event_key_field: Path to the event data field to use as the event key (period-delimited
             string naming the dictionary keys to descend)
+    Returns:
+        None if confluent_kafka is not defined or the settings are invalid.
+        SerializingProducer if it is.
 
     Performance note:
         This could be cached, but requires care such that it allows changes to settings via
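The comment above notes that producers own an outbound-message queue and threads, so `get_producer_for_signal` is memoized with `lru_cache` to keep one long-lived producer per (signal, key field) pair. A standalone sketch of that caching behavior; the string arguments and the returned dict are simplified stand-ins for the real `OpenEdxPublicSignal` and `SerializingProducer` objects:

```python
from functools import lru_cache


@lru_cache
def get_producer_for_signal(signal_name: str, event_key_field: str):
    # Expensive setup (queue, threads) runs once per argument pair; later
    # calls with the same arguments reuse the same long-lived object.
    return {"signal": signal_name, "key_field": event_key_field}


p1 = get_producer_for_signal("org.openedx.test.signal.v1", "user.id")
p2 = get_producer_for_signal("org.openedx.test.signal.v1", "user.id")
assert p1 is p2  # cached: the very same producer instance is returned
```

Note that `lru_cache` keys on argument equality, which is why the real function's cacheability "requires care" around settings changes, as the docstring's performance note says.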
