SciKit-Learn Laboratory


This Python package provides command-line utilities that make it easier to run machine learning experiments with scikit-learn. A primary goal of the project is to let you run scikit-learn experiments without writing any code beyond what you used to generate/extract the features.

Installation

You can install SKLL using either pip or conda; see the installation documentation for details.
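In the typical case, pip install skll pulls the latest release from PyPI; a conda package is also published (check the installation documentation for the current channel name and supported Python versions).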

Requirements

Command-line Interface

The main utility we provide is called run_experiment. It makes it easy to run a series of learners on datasets specified in a configuration file like the following:

[General]
experiment_name = Titanic_Evaluate_Tuned
# valid tasks: cross_validate, evaluate, predict, train
task = evaluate

[Input]
# these directories could also be absolute paths
# (and must be if you're not running things in local mode)
train_directory = train
test_directory = dev
# Can specify multiple sets of feature files that are merged together automatically
featuresets = [["family.csv", "misc.csv", "socioeconomic.csv", "vitals.csv"]]
# List of scikit-learn learners to use
learners = ["RandomForestClassifier", "DecisionTreeClassifier", "SVC", "MultinomialNB"]
# Column in CSV containing labels to predict
label_col = Survived
# Column in CSV containing instance IDs (if any)
id_col = PassengerId

[Tuning]
# Should we tune parameters of all learners by searching provided parameter grids?
grid_search = true
# Function to maximize when performing grid search
objectives = ['accuracy']

[Output]
# Also compute the area under the ROC curve as an additional metric
metrics = ['roc_auc']
# The following can also be absolute paths
logs = output
results = output
predictions = output
probability = true
models = output
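
With the configuration saved to a file (say, evaluate.cfg, a placeholder name), the experiment is launched by passing that file to run_experiment. The same configuration can also be run from Python; the sketch below is a minimal example that assumes the run_configuration helper in skll.experiments behaves as in recent SKLL releases.

# Minimal sketch: run the configuration above programmatically.
# Assumes the config was saved as "evaluate.cfg" (placeholder name) and that
# skll.experiments.run_configuration is available, as in recent SKLL releases.
from skll.experiments import run_configuration

# local=True runs everything serially on the current machine instead of
# submitting jobs to a DRMAA-compatible grid.
run_configuration("evaluate.cfg", local=True)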

For more information about getting started with run_experiment, please check out our tutorial or our config file specs.

You can also follow this interactive Jupyter tutorial.

We also provide utilities for other common tasks, such as converting between data file formats and filtering or joining feature files; see the documentation for the full list.

Python API

If you just want to avoid writing a lot of boilerplate learning code, you can also use our simple Python API, which also supports pandas DataFrames. The main way you'll want to use the API is through the Learner and Reader classes. For more details on our API, see the documentation.
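
As a rough illustration, the sketch below shows the typical Reader/Learner flow. The file names and settings are placeholders mirroring the configuration example above, and the exact call signatures may vary slightly between SKLL releases.

# Minimal sketch of the Python API; "train.csv"/"dev.csv" and the column
# names are placeholders taken from the configuration example above.
from skll.data import Reader
from skll.learner import Learner

# Read training and evaluation data into FeatureSet objects.
train_fs = Reader.for_path("train.csv", label_col="Survived", id_col="PassengerId").read()
test_fs = Reader.for_path("dev.csv", label_col="Survived", id_col="PassengerId").read()

# Train a scikit-learn learner by name, tuning it with grid search,
# then evaluate on the held-out set.
learner = Learner("RandomForestClassifier")
learner.train(train_fs, grid_search=True, grid_objective="accuracy")
results = learner.evaluate(test_fs)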

While our API can be broadly useful, the command-line utilities are intended as the primary way of using SKLL; the API is a nice side effect of developing those utilities.

A Note on Pronunciation

SKLL logo

SciKit-Learn Laboratory (SKLL) is pronounced "skull": that's where the learning happens.

Talks

  • Simpler Machine Learning with SKLL 1.0, Dan Blanchard, PyData NYC 2014 (video | slides)
  • Simpler Machine Learning with SKLL, Dan Blanchard, PyData NYC 2013 (video | slides)

Citing

If you are using SKLL in your work, you can cite it as follows: "We used scikit-learn (Pedregosa et al., 2011) via the SKLL toolkit (https://github.com/EducationalTestingService/skll)."

Books

SKLL is featured in Data Science at the Command Line by Jeroen Janssens.

Changelog

See GitHub releases.

Contribute

Thank you for your interest in contributing to SKLL! See CONTRIBUTING.md for instructions on how to get started.