Support-vector machines (SVMs) are supervised learning models with associated learning algorithms that analyze data for classification and regression.
Maximize the margin: find a hyperplane w·x + b = 0 satisfying y_i(w·x_i + b) ≥ 1 for every training sample (x_i, y_i), with labels y_i ∈ {-1, +1}. The distance between the two supporting hyperplanes is 2/‖w‖, so maximizing the margin can be rewritten as
minimize ½‖w‖² subject to y_i(w·x_i + b) ≥ 1 for all i.
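A minimal numpy sketch of the constraint and the margin, using hypothetical toy data and a hand-picked hyperplane (not a trained one):

```python
import numpy as np

# Toy linearly separable data (hypothetical example), labels in {-1, +1}.
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -3.0]])
y = np.array([1, 1, -1, -1])

# A candidate separating hyperplane w.x + b = 0.
w = np.array([0.5, 0.5])
b = 0.0

# Hard-margin constraint: y_i (w.x_i + b) >= 1 for every sample.
margins = y * (X @ w + b)
assert np.all(margins >= 1)

# Geometric margin of this hyperplane: 2 / ||w||.
print(2 / np.linalg.norm(w))
```

The SVM optimizer searches over all (w, b) meeting the constraint for the pair with the smallest ‖w‖, i.e. the largest margin.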
In addition to performing linear classification, SVMs can efficiently perform non-linear classification using what is called the kernel trick, implicitly mapping their inputs into high-dimensional feature spaces.
Intuitively, the kernel "twists" the input space: after the implicit mapping into a higher-dimensional feature space, a linear cut (a hyperplane) can separate the classes while maximizing the margin.
see also: SVM with polynomial kernel visualization
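The trick can be demonstrated with a degree-2 polynomial kernel: the kernel value equals an inner product in an explicit higher-dimensional feature space, computed without ever building that space. The feature map `phi` below is one standard construction for 2-D inputs (an illustrative sketch, not library code):

```python
import numpy as np

def poly_kernel(x, z, c=1.0):
    # Degree-2 polynomial kernel: K(x, z) = (x.z + c)^2.
    return (x @ z + c) ** 2

def phi(x, c=1.0):
    # Explicit degree-2 feature map for 2-D input whose inner
    # product reproduces poly_kernel.
    x1, x2 = x
    return np.array([x1 * x1, x2 * x2, np.sqrt(2) * x1 * x2,
                     np.sqrt(2 * c) * x1, np.sqrt(2 * c) * x2, c])

x = np.array([1.0, 2.0])
z = np.array([3.0, 0.5])
# Kernel value and explicit-feature inner product agree: 25.0 each.
print(poly_kernel(x, z), phi(x) @ phi(z))
```

Because only K(x, z) is ever evaluated, the SVM can work in the mapped space at the cost of a dot product in the original one.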
This is the most widely used non-linear kernel, because it has few hyperparameters to tune, giving fast training with good results.
A naive alternative: simply use a fitted regression line as the classification boundary.
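One way to read this is least-squares classification: fit a line to labels in {-1, +1} and classify by the sign of the fitted value. A minimal sketch on hypothetical 1-D data:

```python
import numpy as np

# Labels encoded as -1 / +1, fit by ordinary least squares.
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([-1.0, -1.0, 1.0, 1.0])

# Add an intercept column and solve the least-squares problem.
A = np.hstack([X, np.ones((len(X), 1))])
w, _, _, _ = np.linalg.lstsq(A, y, rcond=None)

# Classify by the sign of the regression output.
pred = np.sign(A @ w)
print(pred)  # [-1. -1.  1.  1.]
```

Unlike the SVM, this boundary is chosen to minimize squared error rather than to maximize the margin, so it can be pulled around by points far from the boundary.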