Sequential Minimal Optimization algorithm (SMO)


Implementing the SMO algorithm from scratch, based only on the original article.


The Sequential Minimal Optimization algorithm [1] is a fast algorithm for training support vector machines, designed by J. C. Platt.

Training a support vector machine requires the solution of a very large quadratic programming (QP) optimization problem.
SMO breaks this large QP problem into a series of the smallest possible QP problems, each involving only two Lagrange multipliers.
These small QP problems are solved analytically, which avoids using a time-consuming numerical QP optimization as an inner loop.
The amount of memory required for SMO is linear in the training set size, which allows SMO to handle very large training sets.
Because matrix computation is avoided, SMO scales somewhere between linear and quadratic in the training set size for various test problems, while the standard chunking SVM algorithm scales somewhere between linear and cubic in the training set size.
SMO’s computation time is dominated by SVM evaluation, hence SMO is fastest for linear SVMs and sparse data sets.
On real-world sparse data sets, SMO can be more than 1000 times faster than the chunking algorithm.
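The core idea above — optimizing two Lagrange multipliers at a time analytically instead of running a numerical QP solver — can be sketched in a few dozen lines. The following is a minimal illustration of the *simplified* SMO variant (a linear kernel, and the second multiplier chosen at random rather than via Platt's heuristics), not the full algorithm from the paper; all names are illustrative.

```python
import numpy as np

def simplified_smo(X, y, C=1.0, tol=1e-3, max_passes=10):
    """Simplified SMO: repeatedly pick a pair (alpha_i, alpha_j) that
    violates the KKT conditions and solve the 2-variable QP analytically."""
    n = X.shape[0]
    alpha = np.zeros(n)
    b = 0.0
    K = X @ X.T  # linear-kernel Gram matrix

    def f(i):
        # Decision function at training point i
        return (alpha * y) @ K[:, i] + b

    rng = np.random.default_rng(0)
    passes = 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            E_i = f(i) - y[i]
            # Only update alphas that violate the KKT conditions
            if (y[i] * E_i < -tol and alpha[i] < C) or (y[i] * E_i > tol and alpha[i] > 0):
                j = int(rng.integers(n - 1))
                if j >= i:
                    j += 1  # ensure j != i
                E_j = f(j) - y[j]
                a_i_old, a_j_old = alpha[i], alpha[j]
                # Box constraints for alpha_j (keeps the pair feasible)
                if y[i] != y[j]:
                    L, H = max(0.0, alpha[j] - alpha[i]), min(C, C + alpha[j] - alpha[i])
                else:
                    L, H = max(0.0, alpha[i] + alpha[j] - C), min(C, alpha[i] + alpha[j])
                if L == H:
                    continue
                eta = 2 * K[i, j] - K[i, i] - K[j, j]  # second derivative along the constraint line
                if eta >= 0:
                    continue
                # Analytic optimum for alpha_j, clipped to the box
                alpha[j] = np.clip(alpha[j] - y[j] * (E_i - E_j) / eta, L, H)
                if abs(alpha[j] - a_j_old) < 1e-5:
                    continue
                alpha[i] += y[i] * y[j] * (a_j_old - alpha[j])
                # Update the threshold b
                b1 = b - E_i - y[i] * (alpha[i] - a_i_old) * K[i, i] - y[j] * (alpha[j] - a_j_old) * K[i, j]
                b2 = b - E_j - y[i] * (alpha[i] - a_i_old) * K[i, j] - y[j] * (alpha[j] - a_j_old) * K[j, j]
                if 0 < alpha[i] < C:
                    b = b1
                elif 0 < alpha[j] < C:
                    b = b2
                else:
                    b = (b1 + b2) / 2
                changed += 1
        passes = passes + 1 if changed == 0 else 0

    w = (alpha * y) @ X  # recover the weight vector (valid for the linear kernel only)
    return w, b

# Toy linearly separable data
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -3.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = simplified_smo(X, y)
preds = np.sign(X @ w + b)
```

Note that because each two-variable sub-problem is solved in closed form, the loop touches only rows `i` and `j` of the kernel matrix, which is why SMO's memory footprint stays linear in the training set size.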



References:

[1] Sequential Minimal Optimization: A Fast Algorithm for Training Support Vector Machines, 1998.
