Public repo for the paper "Multi-Intention Aware Configuration Selection for Performance Tuning"

TimHe95/SafeTune

 
 

Artifacts of SafeTune


This repository contains the artifacts (data and source code) of our paper "Multi-Intention Aware Configuration Selection for Performance Tuning".

It includes the following artifacts:

  • dataset: Section 2. Understanding Side-effects
    • a labeled dataset of 7,325 parameters from 13 software systems covering four software domains
    • a labeled dataset of 735 parameters from PostgreSQL, Squid, and Spark (used in RQ1)
  • expansion: Section 3. Semi-supervised Data Expansion
    • source code
    • small-scale labeled data (1,292 parameters) to be expanded
    • rules mined and new data expanded at each iteration
    • a domain-specific synonym list retrieved from the study in Section 2 and supplemented from wiki
    • results reporting how much new data was added and how accurate it is
  • model: Section 4. Learning Based Model to Predict Tuning Guidance
    • source code
    • training data (24,528 entries, obtained after expansion)
    • testing data (735 parameters from PostgreSQL, Squid, Spark)
  • comparing_existing: RQ2. Comparing SafeTune with the State-of-the-Art Tool
    • scripts and commands to validate that the parameters missed by the existing work do have performance impact
    • results including
      • performance impact of each parameter
      • the full comparison between SafeTune and the existing work
      • all the initial testing results, from which the evaluation results in RQ2 are derived
  • case_study: RQ3. Effectiveness of SafeTune in Helping OtterTune
    • the other four cases that are not presented in the paper due to space limitations
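
The semi-supervised expansion step (Section 3) — mine rules from a small labeled seed set, use them to label new parameters, and repeat — can be sketched roughly as a self-training loop. This is an illustrative sketch only: the keyword-voting rules, parameter descriptions, and labels below are hypothetical stand-ins, not the paper's actual mining algorithm or data.

```python
from collections import Counter, defaultdict


def mine_rules(labeled):
    """Map each keyword seen in labeled descriptions to its majority label."""
    votes = defaultdict(Counter)
    for desc, label in labeled:
        for word in desc.lower().split():
            votes[word][label] += 1
    return {word: counts.most_common(1)[0][0] for word, counts in votes.items()}


def expand(labeled, unlabeled, iterations=3):
    """Iteratively mine rules from the labeled pool and label new parameters."""
    labeled = list(labeled)
    for _ in range(iterations):
        rules = mine_rules(labeled)
        remaining = []
        for desc in unlabeled:
            # Collect the rule votes that fire on this description.
            hits = [rules[w] for w in desc.lower().split() if w in rules]
            if hits:
                labeled.append((desc, Counter(hits).most_common(1)[0][0]))
            else:
                remaining.append(desc)
        if len(remaining) == len(unlabeled):  # no progress; stop early
            break
        unlabeled = remaining
    return labeled


# Hypothetical seed labels and candidate parameters for illustration.
seed = [
    ("maximum cache size in memory", "performance"),
    ("enable ssl encryption", "reliability"),
]
candidates = ["cache eviction policy", "ssl certificate path"]
expanded = expand(seed, candidates)
```

Each iteration can fire rules learned from data labeled in earlier iterations, which is why the expanded pool can grow well beyond what the seed rules alone would cover.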
