
kafka-streams-app

A Kafka Streams sample that consumes the Twitter API and counts hashtags. Kafka Streams is a preview feature in Kafka 0.10, so use it at your own risk.

Prerequisites

Build the trunk of Apache Kafka (0.10) or use the distribution provided by Confluent.

Create input & output Kafka topics

bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic streams-hashtag-input

bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic streams-hashtag-count-output

Execute the tweet producer

Set your Twitter credentials in TweetProducer and run it to write tweets to the streams-hashtag-input topic:

private static final String CONSUMER_KEY = "";
private static final String CONSUMER_SECRET = "";
private static final String TOKEN = "";
private static final String SECRET = "";
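
For orientation, here is a minimal producer sketch, assuming the tweet text is already available as plain strings. The real TweetProducer pulls tweets from the Twitter streaming API with the credentials above; the class name and the sample tweets below are illustrative only:

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class TweetProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // Illustrative stand-in for the Twitter streaming client used by the real TweetProducer.
        String[] sampleTweets = { "hello #kafka", "#streams are fun #kafka" };

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            for (String tweet : sampleTweets) {
                // The tweet text goes in the record value; the key is left null.
                producer.send(new ProducerRecord<>("streams-hashtag-input", tweet));
            }
        }
    }
}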

Execute the HashtagJob
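
HashtagJob reads streams-hashtag-input, counts hashtag occurrences, and writes the counts to streams-hashtag-count-output. As a rough orientation, such a topology can be sketched with the current Kafka Streams DSL; the 0.10 preview API used different class names, and the actual HashtagJob in this repository may differ:

import java.util.Arrays;
import java.util.Properties;
import java.util.stream.Collectors;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

public class HashtagJobSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "streams-hashtag-count");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> tweets = builder.stream("streams-hashtag-input");

        // Split each tweet into words, keep only hashtags, and count occurrences per hashtag.
        KTable<String, Long> counts = tweets
                .flatMapValues(tweet -> Arrays.stream(tweet.toLowerCase().split("\\s+"))
                        .filter(word -> word.startsWith("#"))
                        .collect(Collectors.toList()))
                .groupBy((key, hashtag) -> hashtag)
                .count();

        // Write <hashtag, count> pairs to the topic read by the console consumer below.
        counts.toStream().to("streams-hashtag-count-output", Produced.with(Serdes.String(), Serdes.Long()));

        new KafkaStreams(builder.build(), props).start();
    }
}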

Read the output topic containing the hashtags and their counts:

bin/kafka-console-consumer.sh --zookeeper localhost:2181 \
          --topic streams-hashtag-count-output \
          --from-beginning \
          --formatter kafka.tools.DefaultMessageFormatter \
          --property print.key=true \
          --property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
          --property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer
