
PR Final to Master #6

Open · StephenDRoberts wants to merge 44 commits into Final
Conversation

StephenDRoberts (Owner)

No description provided.

README.md Outdated
@@ -2,11 +2,80 @@

Dissertation project for Sheffield Hallam University MSc in Digital & Technology Solutions

#### Overview
## Overview

The purpose of this application is to compare the read and write performances of
a Kafka Streams backed State store against a MongoDB based application.

I'd mention here that you are using Spring Boot, as that could heavily influence your results.

Make it a test between Spring Kafka Streams and Spring MongoDB.

README.md Outdated
The setup script will create our data set and ideally we only want to run the script once to avoid extra work.

To enable this, we have provided two shell scripts that run as pre-launch configurations for both the MongoDB application, and the Kafka streams version.

As mentioned on Slack about the "we": from experience, some reviewers can be picky about I/you/he/she/we/they.

I would just write "To enable this, there are two shell scripts..."

README.md Outdated
2. run the application through your IDEs run functionality.

The application will the process all the messages in the `message-topic` kafka topic.

The application will the

Typo


broker:
image: wurstmeister/kafka:2.12-2.3.0
# image: confluentinc/cp-kafka:5.4.0

Unused code


import java.time.Instant

data class TimingsSummary (

Any reason these are nullable?

StephenDRoberts (Owner, Author)

It's because when I call `val min = allTimings.minOrNull()` in the MessageController, the result comes back nullable. Originally I tried `min()`, but it told me it's deprecated and I should use `minOrNull()`. I tried to work around it but couldn't find anything obvious.
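
For what it's worth, a minimal sketch of one way around this, with assumed field names (the real `TimingsSummary` properties aren't visible in this hunk): `minOrNull()` only returns null when the list is empty, so unwrapping it with an early return (or a default via `?:`) lets the data class keep non-nullable fields.

```kotlin
import java.time.Instant

// Hypothetical field names: the real TimingsSummary properties are not shown in this diff.
data class TimingsSummary(
    val min: Long,
    val max: Long,
    val generatedAt: Instant
)

fun summarise(allTimings: List<Long>): TimingsSummary? {
    // minOrNull()/maxOrNull() only return null for an empty list, so returning
    // early here lets the summary fields stay non-nullable.
    val min = allTimings.minOrNull() ?: return null
    val max = allTimings.maxOrNull() ?: return null
    return TimingsSummary(min = min, max = max, generatedAt = Instant.now())
}
```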

import org.springframework.data.mongodb.core.SimpleMongoClientDatabaseFactory

@Configuration
class SpringMongoConfig (

You can do all this in your `application.yaml` if you want.


Spring should create the beans below for you by default, I believe.

StephenDRoberts (Owner, Author)

How do you mean? I thought I needed to use `@Value` to bring in the properties from `application.yaml` and then create the classes. Is it done with a certain annotation?
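
For illustration, a minimal sketch of what the reviewer seems to mean, with an assumed class name (`MongoHealthCheck` is hypothetical): once `spring-boot-starter-data-mongodb` is on the classpath and `spring.data.mongodb.uri` (or host/port/database) is set in `application.yaml`, Spring Boot's auto-configuration creates the `MongoClient` and `MongoTemplate` beans itself, so no `@Configuration` class or `@Value` wiring is needed; the template can simply be constructor-injected.

```kotlin
import org.springframework.data.mongodb.core.MongoTemplate
import org.springframework.stereotype.Component

// Sketch only: no hand-written SpringMongoConfig. The MongoTemplate bean comes from
// Spring Boot's Mongo auto-configuration, driven by the spring.data.mongodb.*
// properties in application.yaml.
@Component
class MongoHealthCheck(private val mongoTemplate: MongoTemplate) {

    // Lists the collections in the configured database, just to show the injected
    // template is usable without any manual bean definitions.
    fun collectionNames(): Set<String> = mongoTemplate.collectionNames
}
```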

@KafkaListener(id = "mongo-consumer", topics = ["message-topic"])
fun readMessagesFromKafka(message: String) {
logger.info { "Receiving message: $message"}
messageRepository.placeMessageToDB(message)

save?

interface MessageRepositoryInterface : MongoRepository<Message, String> {}

@Repository
class MessageRepository(

This is more of a `MessageService` to me.


It would then enable you to call the interface above `MessageRepository`.
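
For illustration, a hedged sketch of the split being suggested, using assumed names and a placeholder `Message` document (the real class isn't shown in this hunk): the Spring Data interface takes the `MessageRepository` name and provides `save()` for free, while the wrapper class becomes a `MessageService` that delegates to it.

```kotlin
import org.springframework.data.mongodb.core.mapping.Document
import org.springframework.data.mongodb.repository.MongoRepository
import org.springframework.stereotype.Service

// Placeholder document shape; the project's real Message class is not visible here.
@Document
data class Message(val id: String? = null, val body: String)

// Spring Data generates the implementation, including save(), findAll(), etc.
interface MessageRepository : MongoRepository<Message, String>

// The former wrapper class, renamed to reflect its role as a service that
// delegates persistence to the repository.
@Service
class MessageService(private val messageRepository: MessageRepository) {

    fun placeMessageToDB(message: String) {
        messageRepository.save(Message(body = message))
    }
}
```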

key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
properties:
"spring.json.trusted.packages": "com.examples.kafkastreams"

What is this package?

value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
properties:
"spring.json.trusted.packages": "com.examples.kafkastreams"
producer:

Do you have a producer?

implementation("org.springframework.boot:spring-boot-starter")
implementation("org.jetbrains.kotlin:kotlin-reflect")
implementation("org.jetbrains.kotlin:kotlin-stdlib-jdk8")
implementation("com.squareup.okhttp3:okhttp:4.9.0")

Spring has a nice HTTP class called `RestTemplate` that you could have used.
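
For illustration, a minimal sketch of the `RestTemplate` alternative, with a placeholder client name and endpoint: `RestTemplateBuilder` is auto-configured by Spring Boot (via `spring-boot-starter-web`), so the OkHttp dependency could likely be dropped.

```kotlin
import org.springframework.boot.web.client.RestTemplateBuilder
import org.springframework.stereotype.Component

// Sketch only: the builder is provided by Spring Boot's auto-configuration.
@Component
class TimingsClient(builder: RestTemplateBuilder) {

    private val restTemplate = builder.build()

    // Placeholder URL; the project's real endpoint is not shown in this diff.
    fun fetchTimings(): String? =
        restTemplate.getForObject("http://localhost:8080/timings", String::class.java)
}
```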

val fileWriter = FileWriter(file, true)

try {
fileWriter.append("Kafka Stream Results")

Minor, but CSV files don't generally have title lines; they usually start with the header line you have below.

Surprised Excel doesn't kick off about it.
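
For illustration, a small sketch of the header-only layout being suggested (column names are assumptions): write the header row once when the file is empty, then append one data row per run, with no title line above it.

```kotlin
import java.io.File

// Sketch only, with assumed column names. Writes the header the first time, then
// appends a data row per run, so the file starts directly with the header line.
fun appendResult(file: File, avgWriteDuration: Long, readDuration: Long) {
    if (!file.exists() || file.length() == 0L) {
        file.appendText("avgWriteDuration,readDuration\n")
    }
    file.appendText("$avgWriteDuration,$readDuration\n")
}
```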

val avgWriteDuration = getAvgWriteDuration()
val readDuration = getReadAllDuration()
appendToCSV(file, 1, readDuration)
println("done")

You have kotlin-logging in this project but don't seem to use it.
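
For illustration, a minimal sketch of swapping the `println` calls for kotlin-logging, in the same `logger.info { ... }` style already used in the Kafka consumer (assuming the library's `mu.KotlinLogging` entry point):

```kotlin
import mu.KotlinLogging

private val logger = KotlinLogging.logger {}

// Sketch only: replaces println("done") / println("Write CSV successfully!") style output.
fun logRunComplete(avgWriteDuration: Long, readDuration: Long) {
    logger.info { "Appended results: avgWrite=${avgWriteDuration}ms, read=${readDuration}ms" }
}
```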



fun appendToCSV(file: File, avgWriteDuration: Long, readDuration: Long) {
// val resultsFile = Paths.get("poller/src/main/resources/results/$filename").toFile()

Unused code

try {
fileWriter.append("\n")
fileWriter.append("${avgWriteDuration},${readDuration}")
println("Write CSV successfully!")

Wrote

bootstrap-servers: localhost:9093
key-serializer: org.apache.kafka.common.serialization.StringSerializer
value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
# org.springframework.kafka.support.serializer.JsonSerializer

Unused
