davidpicard edited this page Feb 28, 2013 · 3 revisions

JKernelMachines provides a framework for multiclass classification. There is no native multiclass classifier yet; the only available algorithm aggregates multiple binary classifiers in a one-against-all scheme, and is implemented in the class OneAgainstAll. You provide a binary classifier, which is duplicated and trained once for each class. The following example uses LaSVM as the internal binary classifier:

	//build a one-against-all classifier based on LaSVM with a GaussL2 kernel and C=10
	DoubleGaussL2 k = new DoubleGaussL2(2.0);
	LaSVM<double[]> svm = new LaSVM<double[]>(k);
	svm.setC(10);
	OneAgainstAll<double[]> mcsvm = new OneAgainstAll<double[]>(svm);

Namely, an SVM using the LaSVM algorithm with a Gaussian kernel is instantiated and set up, then used as the template classifier by OneAgainstAll.
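Once constructed, the multiclass classifier is trained like any other JKernelMachines classifier, by calling train with a list of TrainingSample. The following is a minimal sketch; the toy Gaussian-blob data and the choice of three classes are purely illustrative, and it assumes the standard package layout of the library:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

import fr.lip6.jkernelmachines.classifier.LaSVM;
import fr.lip6.jkernelmachines.classifier.multiclass.OneAgainstAll;
import fr.lip6.jkernelmachines.kernel.typed.DoubleGaussL2;
import fr.lip6.jkernelmachines.type.TrainingSample;

public class MulticlassTrainingSketch {
	public static void main(String[] args) {
		// generate a toy 3-class problem: one Gaussian blob per class,
		// labeled with the integers 1, 2 and 3 (illustrative data)
		Random rand = new Random(42);
		List<TrainingSample<double[]>> list = new ArrayList<TrainingSample<double[]>>();
		for (int c = 1; c <= 3; c++) {
			for (int i = 0; i < 20; i++) {
				double[] x = { c + 0.1 * rand.nextGaussian(), c + 0.1 * rand.nextGaussian() };
				list.add(new TrainingSample<double[]>(x, c));
			}
		}

		// same setup as above: one-against-all over LaSVM with a Gaussian kernel
		DoubleGaussL2 k = new DoubleGaussL2(2.0);
		LaSVM<double[]> svm = new LaSVM<double[]>(k);
		svm.setC(10);
		OneAgainstAll<double[]> mcsvm = new OneAgainstAll<double[]>(svm);

		// train on the whole list, then query a prediction
		mcsvm.train(list);
		System.out.println("predicted class: " + mcsvm.valueOf(list.get(0).sample));
	}
}
```

Note that compiling this sketch requires the JKernelMachines jar on the classpath.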

To produce multiclass data that fits JKernelMachines, you only need each TrainingSample's label to be equal to its class. This means you have to map symbolic classes onto integers. For example, if you have three classes named 'car', 'horse' and 'train', a valid mapping would be {1, 2, 3}: a label of 1 means the sample belongs to the 'car' category, a 2 means it is a 'horse', and so on. Any integer mapping is valid (the labels need not be the first n integers).
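Such a mapping can be sketched in plain Java with a Map from class names to integer labels (the class names and the annotation list below are illustrative):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class LabelMapping {
	public static void main(String[] args) {
		// map symbolic class names to integer labels (any distinct integers work)
		Map<String, Integer> labels = new LinkedHashMap<String, Integer>();
		labels.put("car", 1);
		labels.put("horse", 2);
		labels.put("train", 3);

		// convert a list of symbolic annotations to the integer labels
		// that TrainingSample expects
		List<String> annotations = List.of("horse", "car", "train", "car");
		List<Integer> y = new ArrayList<Integer>();
		for (String a : annotations) {
			y.add(labels.get(a));
		}
		System.out.println(y); // [2, 1, 3, 1]
	}
}
```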

The value returned by the multiclass classifier is the integer corresponding to the predicted class. You can verify the correctness of a prediction using the following code:

	double v = mcsvm.valueOf(t.sample);
	if (v != t.label) {
		System.out.println("error: got " + v + ", expected " + t.label);
	}

To evaluate the multiclass accuracy, the MulticlassAccuracyEvaluator class is your friend. A useful example of multiclass training and evaluation is provided in fr.lip6.jkernelmachines.example.MulticlassExample.
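As a sketch of how such an evaluation might look, the snippet below splits toy data into a training and a testing set and scores the classifier with MulticlassAccuracyEvaluator. It assumes the standard Evaluator workflow of the library (setClassifier, setTrainingSet, setTestingSet, evaluate, getScore); the toy data is illustrative:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

import fr.lip6.jkernelmachines.classifier.LaSVM;
import fr.lip6.jkernelmachines.classifier.multiclass.OneAgainstAll;
import fr.lip6.jkernelmachines.evaluation.MulticlassAccuracyEvaluator;
import fr.lip6.jkernelmachines.kernel.typed.DoubleGaussL2;
import fr.lip6.jkernelmachines.type.TrainingSample;

public class MulticlassEvalSketch {
	public static void main(String[] args) {
		// toy 3-class data, split into training and testing sets (illustrative)
		Random rand = new Random(0);
		List<TrainingSample<double[]>> train = new ArrayList<TrainingSample<double[]>>();
		List<TrainingSample<double[]>> test = new ArrayList<TrainingSample<double[]>>();
		for (int c = 1; c <= 3; c++) {
			for (int i = 0; i < 30; i++) {
				double[] x = { c + 0.1 * rand.nextGaussian(), c + 0.1 * rand.nextGaussian() };
				(i < 20 ? train : test).add(new TrainingSample<double[]>(x, c));
			}
		}

		// one-against-all over LaSVM with a Gaussian kernel, as above
		OneAgainstAll<double[]> mcsvm = new OneAgainstAll<double[]>(
				new LaSVM<double[]>(new DoubleGaussL2(2.0)));

		// train on the training set and score accuracy on the held-out set
		MulticlassAccuracyEvaluator<double[]> eval = new MulticlassAccuracyEvaluator<double[]>();
		eval.setClassifier(mcsvm);
		eval.setTrainingSet(train);
		eval.setTestingSet(test);
		eval.evaluate();
		System.out.println("multiclass accuracy: " + eval.getScore());
	}
}
```

As with the earlier snippets, this requires the JKernelMachines jar on the classpath.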