Rumale

Rumale (Ruby machine learning) is a machine learning library in Ruby. Rumale provides machine learning algorithms with interfaces similar to Scikit-Learn in Python. Rumale supports Linear / Kernel Support Vector Machine, Logistic Regression, Linear Regression, Ridge, Lasso, Kernel Ridge, Factorization Machine, Naive Bayes, Decision Tree, AdaBoost, Gradient Tree Boosting, Random Forest, Extra-Trees, K-nearest neighbor classifier, K-Means, K-Medoids, Gaussian Mixture Model, DBSCAN, HDBSCAN, SNN, Spectral Clustering, Power Iteration Clustering, Mutidimensional Scaling, t-SNE, Principal Component Analysis, Kernel PCA and Non-negative Matrix Factorization.

This project was formerly known as "SVMKit". If you are using SVMKit, please install Rumale and replace SVMKit constants with Rumale.
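Since only the top-level constant prefix changes, a mechanical text substitution covers most migrations. A minimal sketch in plain Ruby (the SVC line is just an illustrative example of SVMKit-era code):

```ruby
# Migrating SVMKit code to Rumale is mostly a matter of renaming the
# top-level constant from SVMKit to Rumale.
old_code = "estimator = SVMKit::LinearModel::SVC.new(reg_param: 0.0001)"
new_code = old_code.gsub('SVMKit', 'Rumale')
puts new_code
# => estimator = Rumale::LinearModel::SVC.new(reg_param: 0.0001)
```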

Installation
Add this line to your application's Gemfile:

gem 'rumale'

And then execute:

$ bundle

Or install it yourself as:

$ gem install rumale

Usage
Example 1. XOR data

First, let's classify simple XOR data. In Rumale, feature vectors and labels are represented by Numo::NArray.

require 'rumale'

# Prepare XOR data.
features = [[0, 0], [0, 1], [1, 0], [1, 1]]
labels = [0, 1, 1, 0]

# Convert Ruby Array into Numo::NArray.
x = Numo::DFloat.asarray(features)
y = Numo::Int32.asarray(labels)

# Train classifier with nearest neighbor rule.
estimator = Rumale::NearestNeighbors::KNeighborsClassifier.new(n_neighbors: 1)
estimator.fit(x, y)

# Predict labels.
p y
p estimator.predict(x)

Executing the above script results in the following output.

[0, 1, 1, 0]
[0, 1, 1, 0]

The basic usage of Rumale is to first train the model with the fit method and then estimate with the predict method.
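The fit/predict interface is a duck-typed convention: any object that responds to fit and predict can stand in for a Rumale estimator. A minimal hand-rolled sketch (hypothetical ToyNearestNeighbor class, plain Ruby arrays instead of Numo::NArray) shows the shape of the contract:

```ruby
# A toy 1-nearest-neighbor estimator following Rumale's fit/predict
# convention (illustrative only; real Rumale estimators use Numo::NArray).
class ToyNearestNeighbor
  def fit(x, y)
    @x = x
    @y = y
    self # fit returns the estimator itself, so calls can be chained
  end

  def predict(x)
    x.map do |sample|
      # Pick the label of the closest training sample (squared distance).
      dists = @x.map { |t| t.zip(sample).sum { |a, b| (a - b)**2 } }
      @y[dists.index(dists.min)]
    end
  end
end

features = [[0, 0], [0, 1], [1, 0], [1, 1]]
labels = [0, 1, 1, 0]
pred = ToyNearestNeighbor.new.fit(features, labels).predict(features)
p pred # => [0, 1, 1, 0]
```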

Example 2. Pendigits dataset classification

Rumale provides a function for loading dataset files in LIBSVM format. We start by downloading the pendigits dataset from the LIBSVM Data website.

$ wget
$ wget

The following code trains a classifier with a linear SVM and an RBF kernel feature map.

require 'rumale'

# Load the training dataset.
samples, labels = Rumale::Dataset.load_libsvm_file('pendigits')

# Map training data to RBF kernel feature space.
transformer = Rumale::KernelApproximation::RBF.new(gamma: 0.0001, n_components: 1024, random_seed: 1)
transformed = transformer.fit_transform(samples)

# Train linear SVM classifier.
classifier = Rumale::LinearModel::SVC.new(reg_param: 0.0001, max_iter: 1000, batch_size: 50, random_seed: 1)
classifier.fit(transformed, labels)

# Save the model.
File.open('transformer.dat', 'wb') { |f| f.write(Marshal.dump(transformer)) }
File.open('classifier.dat', 'wb') { |f| f.write(Marshal.dump(classifier)) }

The following code classifies the testing data with the trained classifier.

require 'rumale'

# Load the testing dataset.
samples, labels = Rumale::Dataset.load_libsvm_file('pendigits.t')

# Load the model.
transformer = Marshal.load(File.binread('transformer.dat'))
classifier = Marshal.load(File.binread('classifier.dat'))

# Map testing data to RBF kernel feature space.
transformed = transformer.transform(samples)

# Classify the testing data and evaluate prediction results.
puts("Accuracy: %.1f%%" % (100.0 * classifier.score(transformed, labels)))

# Alternative evaluation approach
# results = classifier.predict(transformed)
# evaluator = Rumale::EvaluationMeasure::Accuracy.new
# puts("Accuracy: %.1f%%" % (100.0 * evaluator.score(results, labels)))

Executing the above scripts results in the following output.

$ ruby train.rb
$ ruby test.rb
Accuracy: 98.4%

Example 3. Cross-validation

require 'rumale'

# Load dataset.
samples, labels = Rumale::Dataset.load_libsvm_file('pendigits')

# Define the estimator to be evaluated.
lr = Rumale::LinearModel::LogisticRegression.new(reg_param: 0.0001, random_seed: 1)

# Define the evaluation measure, splitting strategy, and cross validation.
ev = Rumale::EvaluationMeasure::LogLoss.new
kf = Rumale::ModelSelection::StratifiedKFold.new(n_splits: 5, shuffle: true, random_seed: 1)
cv = Rumale::ModelSelection::CrossValidation.new(estimator: lr, splitter: kf, evaluator: ev)

# Perform 5-fold cross-validation.
report = cv.perform(samples, labels)

# Output result.
mean_logloss = report[:test_score].inject(:+) / kf.n_splits
puts("5-CV mean log-loss: %.3f" % mean_logloss)

Executing the above script results in the following output.

$ ruby cross_validation.rb
5-CV mean log-loss: 0.476
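The report returned by perform is a plain Hash of per-fold results, which is why the mean can be computed with inject as above. A sketch of the aggregation step with made-up fold scores (the :test_score key follows Rumale's CrossValidation report; the numbers are hypothetical):

```ruby
# Hypothetical per-fold log-loss values; the real report comes from
# Rumale::ModelSelection::CrossValidation#perform.
report = { test_score: [0.48, 0.47, 0.46, 0.49, 0.48] }
n_splits = report[:test_score].size
mean_logloss = report[:test_score].inject(:+) / n_splits
puts format('5-CV mean log-loss: %.3f', mean_logloss)
# => 5-CV mean log-loss: 0.476
```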

Example 4. Pipeline

require 'rumale'

# Load dataset.
samples, labels = Rumale::Dataset.load_libsvm_file('pendigits')

# Construct pipeline with kernel approximation and SVC.
rbf = Rumale::KernelApproximation::RBF.new(gamma: 0.0001, n_components: 800, random_seed: 1)
svc = Rumale::LinearModel::SVC.new(reg_param: 0.0001, max_iter: 1000, random_seed: 1)
pipeline = Rumale::Pipeline::Pipeline.new(steps: { trns: rbf, clsf: svc })

# Define the splitting strategy and cross validation.
kf = Rumale::ModelSelection::StratifiedKFold.new(n_splits: 5, shuffle: true, random_seed: 1)
cv = Rumale::ModelSelection::CrossValidation.new(estimator: pipeline, splitter: kf)

# Perform 5-fold cross-validation.
report = cv.perform(samples, labels)

# Output result.
mean_accuracy = report[:test_score].inject(:+) / kf.n_splits
puts("5-CV mean accuracy: %.1f %%" % (mean_accuracy * 100.0))

Executing the above script results in the following output.

$ ruby pipeline.rb
5-CV mean accuracy: 99.2 %

Speeding up

Numo::Linalg
Loading Numo::Linalg allows Numo::NArray to perform matrix products with BLAS libraries. For example, using OpenBLAS speeds up many estimators in Rumale.

Install the OpenBLAS library.

macOS:
$ brew install openblas

Ubuntu:
$ sudo apt-get install gcc gfortran
$ wget
$ tar xzf v0.3.5.tar.gz
$ cd OpenBLAS-0.3.5
$ make USE_OPENMP=1
$ sudo make PREFIX=/usr/local install

Install the Numo::Linalg gem.

$ gem install numo-linalg

In a Ruby script, you only need to require the autoloader module of Numo::Linalg.

require 'numo/linalg/autoloader'
require 'rumale'

Parallel
Several estimators in Rumale support parallel processing. Parallel processing in Rumale is realized with the Parallel gem, so install and load it:

$ gem install parallel
require 'parallel'
require 'rumale'

Estimators that support parallel processing have an n_jobs parameter. When -1 is given to the n_jobs parameter, all processors are used.

estimator = Rumale::Ensemble::RandomForestClassifier.new(n_jobs: -1, random_seed: 1)

Development
After checking out the repo, run bin/setup to install dependencies. Then, run rake spec to run the tests. You can also run bin/console for an interactive prompt that will allow you to experiment.

To install this gem onto your local machine, run bundle exec rake install. To release a new version, update the version number in version.rb, and then run bundle exec rake release, which will create a git tag for the version, push git commits and tags, and push the .gem file to rubygems.org.

Contributing
Bug reports and pull requests are welcome on GitHub. This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the Contributor Covenant code of conduct.

License
The gem is available as open source under the terms of the BSD 2-Clause License.

Code of Conduct

Everyone interacting in the Rumale project’s codebases, issue trackers, chat rooms and mailing lists is expected to follow the code of conduct.
