PR Final to Master #6
base: Final
Conversation
Added info to readme
Kafka streams
Working Mongo MVP -> need to tidy code to be similar on both sides
Message format
Adding poller for results, updated diagram
README.md
Outdated
    @@ -2,11 +2,80 @@

    Dissertation project for Sheffield Hallam University MSc in Digital & Technology Solutions

    -#### Overview
    +## Overview

    The purpose of this application is to compare the read and write performances of
    a Kafka Streams backed State store against a MongoDB based application.
I'd mention here that you are using Spring Boot, as that could heavily influence your results. Make it a test between Spring Kafka Streams and Spring MongoDB.
README.md
Outdated
    The setup script will create our data set and ideally we only want to run the script once to avoid extra work.

    To enable this, we have provided two shell scripts that run as pre-launch configurations for both the MongoDB application, and the Kafka streams version.
As mentioned on Slack: "we". From experience, some reviewers can be picky about I/you/he/she/we/they. Would just write "to enable this, there are two shell scripts...".
README.md
Outdated
    2. run the application through your IDEs run functionality.

    The application will the process all the messages in the `message-topic` kafka topic.
The application will the
Typo
docker-compose.yaml
Outdated
    broker:
      image: wurstmeister/kafka:2.12-2.3.0
      # image: confluentinc/cp-kafka:5.4.0
Unused code
    import java.time.Instant

    data class TimingsSummary (
Any reason these are nullable?
It's because of when I call `val min = allTimings.minOrNull()` in the MessageController. Originally I tried `min()`, but it told me it's deprecated and I should use `minOrNull()`. I tried to work around it but couldn't find anything obvious.
    import org.springframework.data.mongodb.core.SimpleMongoClientDatabaseFactory

    @Configuration
    class SpringMongoConfig (
Can do all this in your `application.yaml` if you want.
Spring should create the beans below for you by default I believe
How do you mean? I thought I needed to use `@Value` to bring in the properties from `application.yaml` and then create the classes? Is it done with a certain annotation?
    @KafkaListener(id = "mongo-consumer", topics = ["message-topic"])
    fun readMessagesFromKafka(message: String) {
        logger.info { "Receiving message: $message" }
        messageRepository.placeMessageToDB(message)
`save`?
    interface MessageRepositoryInterface : MongoRepository<Message, String> {}

    @Repository
    class MessageRepository(
This is more of a `MessageService` to me.
It would then enable you to call the interface above `MessageRepository`.
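The suggested rename can be sketched as follows. This is a hypothetical standalone illustration, not the project's actual code: the `Message` fields, the `placeMessageToDB` body, and the in-memory fake are all made up, and in the real app the interface would extend Spring Data's `MongoRepository<Message, String>` rather than declaring `save` itself:

```kotlin
// Illustrative domain type; the real Message class will differ
data class Message(val id: String, val body: String)

// Takes over the MessageRepository name; in the real app:
//   interface MessageRepository : MongoRepository<Message, String>
interface MessageRepository {
    fun save(message: Message): Message
}

// The wrapping class, previously called MessageRepository, becomes the service
class MessageService(private val repository: MessageRepository) {
    fun placeMessageToDB(body: String): Message =
        repository.save(Message(id = body.hashCode().toString(), body = body))
}

fun main() {
    // An in-memory fake stands in for the Mongo-backed repository
    val saved = mutableListOf<Message>()
    val service = MessageService(object : MessageRepository {
        override fun save(message: Message): Message {
            saved.add(message)
            return message
        }
    })
    service.placeMessageToDB("hello")
    println(saved.size)  // 1
}
```

A side benefit of the split is visible in the `main` above: the service can be unit-tested against a fake repository without a running MongoDB.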
    key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
    value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
    properties:
      "spring.json.trusted.packages": "com.examples.kafkastreams"
What is this package?
    value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
    properties:
      "spring.json.trusted.packages": "com.examples.kafkastreams"
    producer:
Do you have a `producer`?
    implementation("org.springframework.boot:spring-boot-starter")
    implementation("org.jetbrains.kotlin:kotlin-reflect")
    implementation("org.jetbrains.kotlin:kotlin-stdlib-jdk8")
    implementation("com.squareup.okhttp3:okhttp:4.9.0")
Spring has a nice HTTP class called `RestTemplate` you could have used.
    val fileWriter = FileWriter(file, true)

    try {
        fileWriter.append("Kafka Stream Results")
Minor, but CSV files don't generally have title lines; they usually start with the header line you have below. Surprised Excel doesn't kick off about it.
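A minimal sketch of the suggestion, starting the file at the header row so every line has the same column count (the column names and values here are illustrative, not the project's real output):

```kotlin
import java.io.File

fun main() {
    val file = File.createTempFile("results", ".csv")
    file.printWriter().use { out ->
        out.println("avgWriteDuration,readDuration")  // header row first, no title row
        out.println("1,42")                           // data rows follow
    }
    println(file.readLines().first())  // avgWriteDuration,readDuration
}
```

With a uniform column count per line, any CSV reader (Excel included) can infer the schema from line one instead of skipping an odd-shaped title row.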
    val avgWriteDuration = getAvgWriteDuration()
    val readDuration = getReadAllDuration()
    appendToCSV(file, 1, readDuration)
    println("done")
You have kotlin-logging in this project but don't seem to use it.
    fun appendToCSV(file: File, avgWriteDuration: Long, readDuration: Long) {
        // val resultsFile = Paths.get("poller/src/main/resources/results/$filename").toFile()
Unused code
    try {
        fileWriter.append("\n")
        fileWriter.append("${avgWriteDuration},${readDuration}")
        println("Write CSV successfully!")
Wrote
    bootstrap-servers: localhost:9093
    key-serializer: org.apache.kafka.common.serialization.StringSerializer
    value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
    # org.springframework.kafka.support.serializer.JsonSerializer
Unused
Kafka streams testing
MongoDB Unit Testing
Code coverage
Avg write duration
GitHub actions