
The Course Title

Reading Course on Advanced Topics in Distributed Systems


The Course Content

This is a graduate reading course covering advanced topics in distributed systems, including but not limited to distributed learning, gossip-based learning, graph neural networks, and large-scale graph processing. Every participant should find relevant research papers, read and analyze their contributions, give a presentation on the material, actively contribute to the group discussions, and write a short report on the selected papers. The course is given by the distributed computing group at KTH (DC@KTH).


Intended Learning Outcomes (ILO)

After the course, the student will be able to discuss, analyze, present, and critically review the latest research advancements in the areas of distributed systems and learning, and make connections to knowledge in related fields. The student will also be able to assess and evaluate emerging trends and identify the need for further knowledge in the field.


Course Disposition

The course is organized as a reading course. Each student will be required to perform the following tasks:

  • Task 1: identify relevant research literature under the topic of distributed systems, with a focus on distributed learning, gossip-based learning, graph neural networks, large-scale graph processing, or similar. Scan the related literature and select three papers that you would like to review. Ideally, the papers should tackle the same problem, or apply the same discipline/approach to different problems. The key point is that the selected papers share some common ground on which they can be compared against each other.

  • Task 2: write a short justification paragraph explaining your choice of papers. Note that, at this stage, you are not required to read the papers in detail. The paragraph should mostly focus on why you are interested in the selected topic and how you think your selected papers relate to it (e.g., they address the same research question, or they apply different approaches to the same problem).

  • Task 3: carefully read, analyze, and compare the selected papers to prepare an oral presentation. The presentation should not only present what is in the papers, but mostly contrast and compare their approaches, contributions, and shortcomings, possibly offering insights into related future research. The presentation should be delivered during one of our regular seminar sessions.

  • Task 4: write a critical review of the papers, covering in particular their contributions, solutions, significance, and technical/experimental quality.

  • Task 5: choose one of your peers' presentations to oppose. You will need to read those papers as well and gain a general understanding of their content, contributions, and any notable limitations. You must attend the presentation you are opposing and take careful notes on how you perceived its quality, in terms of content, the suitability of the chosen papers and the links between them, and the quality of the delivery.

  • Task 6: deliver a written report reviewing your opponent's work. The review should present objective arguments on what you consider the strengths and weaknesses of the opposed presentation. The report should clearly explain why you think the selected papers do or do not fit within the course's topic, how fairly the presentation explained the content of the papers, and what the presentation's strong points and possible shortcomings were.

  • Task 7: attend a minimum of 75% of the seminars.

Papers and Schedule

November 4, 2019 - Zainab Abbas (opponent: Sana Imtiaz)

  • A Deep Learning Framework for Graph Partitioning [pdf]
  • Device Placement Optimization with Reinforcement Learning [pdf]
  • Streaming Graph Partitioning for Large Distributed Graphs [pdf]
  • [justification] [slides] [review]

November 11, 2019 - Klas Segeljakt (opponent: Max Meldrum)

  • Structured Streaming: A Declarative API for Real-Time Applications in Apache Spark [pdf]
  • State Management in Apache Flink [pdf]
  • Consistent Regions: Guaranteed Tuple Processing in IBM Streams [pdf]
  • [justification] [slides-pdf] [slides-pptx] [review]

November 18, 2019 - Sana Imtiaz (opponent: Zainab Abbas)

  • Semi-supervised Knowledge Transfer for Deep Learning from Private Training Data [pdf]
  • SecureML: A System for Scalable Privacy-Preserving Machine Learning [pdf]
  • Chiron: Privacy-preserving Machine Learning as a Service [pdf]
  • [justification] [slides] [review]

November 25, 2019 - Moritz Meister (opponent: Stefanos Antaris)

  • BOHB: Robust and Efficient Hyperparameter Optimization at Scale [pdf]
  • Massively Parallel Hyperparameter Tuning [pdf]
  • Population Based Training of Neural Networks [pdf]
  • [justification] [slides] [review]

December 16, 2019 - Lodovico Giaretta (opponent: Negar Safinianaini)

  • Gated Graph Sequence Neural Networks [pdf]
  • Semi-Supervised Classification with Graph Convolutional Networks [pdf]
  • Inductive Representation Learning on Large Graphs [pdf]
  • [justification] [slides] [review]

January 29, 2020 - Tianze Wang (opponent: Sina Sheikholeslami)

  • Exploring Hidden Dimensions in Parallelizing Convolutional Neural Networks [pdf]
  • Beyond Data and Model Parallelism for Deep Neural Networks [pdf]
  • Priority-based Parameter Propagation for Distributed DNN Training [pdf]
  • [justification] [slides] [review]

February 5, 2020 - Max Meldrum (opponent: Klas Segeljakt)

  • MillWheel: Fault-Tolerant Stream Processing at Internet Scale [pdf]
  • Naiad: A Timely Dataflow System [pdf]
  • Ray: A Distributed Framework for Emerging AI Applications [pdf]
  • [justification] [slides] [review]

February 26, 2020 - Stefanos Antaris (opponent: Susanna Pozzoli)

  • Deep Decentralized Multi-task Multi-Agent Reinforcement Learning under Partial Observability [pdf]
  • Multi-Agent Adversarial Inverse Reinforcement Learning [pdf]
  • Fully Decentralized Multi-Agent Reinforcement Learning with Networked Agents [pdf]
  • [justification] [slides] [review]

March 4, 2020 - David Daharewa Gureya

  • CoPart: Coordinated Partitioning of Last-Level Cache and Memory Bandwidth for Fairness-Aware Workload Consolidation on Commodity Servers [pdf]
  • PARTIES: QoS-Aware Resource Partitioning for Multiple Interactive Services [pdf]
  • SWAP: Effective Fine-Grain Management of Shared Last-Level Caches with Minimum Hardware Support [pdf]
  • [justification] [slides] [review]

March 11, 2020 - Amir Hossein Rahnama (opponent: Tianze Wang)

  • This Looks Like That: Deep Learning for Interpretable Image Recognition [pdf]
  • Interpretable Image Recognition with Hierarchical Prototypes [pdf]
  • Deep Learning for Case-Based Reasoning through Prototypes: A Neural Network that Explains Its Predictions [pdf]
  • [justification] [slides] [review]

March 18, 2020 - Sina Sheikholeslami (opponent: Moritz Meister and David Daharewa Gureya)

  • Supporting Very Large Models using Automatic Dataflow Graph Partitioning [pdf]
  • Placeto: Learning Generalizable Device Placement Algorithms for Distributed Machine Learning [pdf]
  • GDP: Generalized Device Placement for Dataflow Graphs [pdf]
  • [justification] [slides] [review]

March 25, 2020 - Susanna Pozzoli (opponent: Lodovico Giaretta)

  • Frequent Subgraph Mining Based on Pregel [pdf]
  • Large-Scale Frequent Subgraph Mining in MapReduce [pdf]
  • Leveraging Multiple GPUs and CPUs for Graphlet Counting in Large Networks [pdf]
  • [justification] [slides] [review]

Contact

Contact Amir H. Payberah if you have any questions.
