Implement simple word generator-aggregator model on REEF #1
As a start, I'll implement a simple word counter with a single task, as mentioned in issue #4. After that, I'll try to separate the word generator and aggregator into two or three tasks.
To separate the word generator and aggregator tasks, I think we'd better implement the word generator and the aggregator as different REEF jobs and run those jobs on separate machines. We can use RemoteManager for the network communication. Because RemoteManager provides a low-level socket-based API, we can make a connection easily if we have the addresses of the word generator and the aggregator. It handles data using Wake EventHandlers, so we can also easily implement the data-receiving logic inside an EventHandler. The main aggregator task thread will sleep most of the time, and the EventHandler inside it will wake up and handle the data whenever it arrives. The word generator and the aggregator should use the same codec for serialization/deserialization of the data. Below is the overall structure of the model.
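To make the structure above concrete, here is a minimal, self-contained sketch of the generator/aggregator wiring. The `EventHandler` and `Codec` interfaces below are hypothetical stand-ins for Wake's interfaces, and the direct method call stands in for the RemoteManager network hop; none of this is the actual Wake API, just an illustration of the idea that both ends share one codec and the aggregator reacts inside its handler.

```java
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

public class WordCountSketch {

  // Hypothetical stand-in for Wake's EventHandler<T> interface.
  interface EventHandler<T> { void onNext(T event); }

  // Hypothetical stand-in for a codec: serialize/deserialize events.
  interface Codec<T> { byte[] encode(T obj); T decode(byte[] data); }

  // Both the generator and the aggregator must use this same codec.
  static final class StringCodec implements Codec<String> {
    public byte[] encode(String s) { return s.getBytes(StandardCharsets.UTF_8); }
    public String decode(byte[] b)  { return new String(b, StandardCharsets.UTF_8); }
  }

  // Aggregator side: the handler updates word counts whenever data arrives.
  static final class Aggregator implements EventHandler<byte[]> {
    final Codec<String> codec = new StringCodec();
    final Map<String, Integer> counts = new HashMap<>();
    public void onNext(byte[] data) {
      String word = codec.decode(data);     // decode with the shared codec
      counts.merge(word, 1, Integer::sum);  // update the running count
    }
  }

  public static void main(String[] args) {
    Codec<String> codec = new StringCodec();
    Aggregator aggregator = new Aggregator();
    // Generator side: encode words and "send" them to the aggregator's
    // handler. A real setup would push these bytes through RemoteManager;
    // here the network hop is simulated by a direct call.
    for (String w : new String[]{"mist", "reef", "mist"}) {
      aggregator.onNext(codec.encode(w));
    }
    System.out.println(aggregator.counts.get("mist") + " " + aggregator.counts.get("reef"));
  }
}
```

In the real deployment the aggregator's main thread would just sleep while the handler fires on incoming data, which is exactly the pattern sketched in `Aggregator.onNext`.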
@DifferentSC @TaeHunKim How about using
I think
I made an example of how to use
@DifferentSC We don't write dirty code. ;-)
Thanks to @DifferentSC, I succeeded in implementing a word counter with one generator and one aggregator.
I finished the basic experiment. I'll extend it in other issues.
@TaeHunKim Is it OK to close this issue? If yes, please close it.
@taegeonum I forgot to close it. Thank you. I'm going to close it. |
As the first step of the MIST project, we need to implement a simple word generator-aggregator stream processing application on REEF. This issue is important for the two reasons below.