
Return weights for every sample? #150

Open
antithing opened this issue May 15, 2018 · 2 comments
antithing commented May 15, 2018

I have a trained model and am using the following to return the likelihoods:


	UINT predictedClassLabel = bag.getPredictedClassLabel();
	VectorFloat classLikelihoods = bag.getClassLikelihoods();
	VectorFloat classDistances = bag.getClassDistances();

	std::cout << " ClassLikelihoods: ";
	for (UINT j = 0; j < classLikelihoods.size(); j++) {
		std::cout << classLikelihoods[j] << " ";
	}
	std::cout << " ClassDistances: ";
	for (UINT j = 0; j < classDistances.size(); j++) {
		std::cout << classDistances[j] << " ";
	}

This is working great, giving me, for example:

PredictedClassLabel: 6 ClassLikelihoods: 0 0 0 0 0 1 0 0 0 0 0  
ClassDistances: 0 0 0 0 0 0.390623 0 0 0 0 0

The classification is correct, but based on the data I have given it, I would expect the distances of some other classes to be non-zero as well. Is it possible to return the 'weights' or distance for every class, not just the predicted class?

Or should this be happening already?

Thanks!


antithing commented May 15, 2018

.. if I use a pipeline like this:

	bool loadResult = trainingData.load("TrainingData.grt");

	//Print out some stats about the training data
	trainingData.printStats();

	//Create a new Gesture Recognition Pipeline using an Adaptive Naive Bayes Classifier
	GestureRecognitionPipeline pipeline;
	pipeline.setClassifier(ANBC());

	//Train the pipeline using the training data
	if (!pipeline.train(trainingData)) {
		std::cout << "ERROR: Failed to train the pipeline!\n";
		return EXIT_FAILURE;
	}

	//You can then get the accuracy of how well the pipeline performed during the k-fold cross validation testing
	double accuracy = pipeline.getCrossValidationAccuracy();

	//Perform the prediction
	bool predictionSuccess = pipeline.predict(inputVector);

	//Get the predicted class label, likelihoods, and distances
	UINT predictedClassLabel = pipeline.getPredictedClassLabel();
	VectorFloat classLikelihoods = pipeline.getClassLikelihoods();
	VectorFloat classDistances = pipeline.getClassDistances();

	std::cout << " PredictedClassLabel: " << predictedClassLabel;

	std::cout << " ClassLikelihoods: ";
	for (UINT j = 0; j < classLikelihoods.size(); j++) {
		std::cout << classLikelihoods[j] << " ";
	}
	std::cout << " ClassDistances: ";
	for (UINT j = 0; j < classDistances.size(); j++) {
		std::cout << classDistances[j] << " ";
	}

It gives me distances of:


 PredictedClassLabel: 4 ClassLikelihoods: 0 0 0 1 0 0 0 0 0 0 0 0  
ClassDistances: -38.3431 -52.1119 -59.9752 -6.21665 -42.6632 -48.6468 -inf -54.2848 -44.8232
-80.4233 -42.3163 -22.5498

where the distance closest to zero is the correct class. Can I use these numbers as 'weights' for each class somehow? Ideally I need a 0 to 1 value for each.

thanks again.

@antithing

... Ah, switching to SVM is working for me. Thanks!
