Commit 6587b2a (0 parents)
This commit does not belong to any branch on this repository, and may belong to a fork outside of the repository.
Showing 33 changed files with 3,199 additions and 0 deletions.
DESCRIPTION
@@ -0,0 +1,19 @@
Package: evclass
Type: Package
Title: Evidential Distance-Based Classification
Version: 1.0.1
Date: 2016-06-21
Author: Thierry Denoeux
Maintainer: Thierry Denoeux <tdenoeux@utc.fr>
Description: Different evidential distance-based classifiers, which provide
    outputs in the form of Dempster-Shafer mass functions. The methods are: the
    evidential K-nearest neighbor rule and the evidential neural network.
License: GPL-3
Depends: R (>= 3.1.0)
Imports: FNN
LazyData: TRUE
RoxygenNote: 5.0.1
NeedsCompilation: no
Packaged: 2016-06-21 09:49:58 UTC; Thierry
Repository: CRAN
Date/Publication: 2016-06-21 17:42:52
MD5
@@ -0,0 +1,32 @@
796e0a2c90a6bd52ddb6912593f0598f *DESCRIPTION
6dfd314ed4c8aaa318cb72b37c1bdd9d *NAMESPACE
bcd052b70a02a0e61c6457b927c334c8 *R/EkNNfit.R
a0a4669c272eff4f63a0bdcacafe84e7 *R/EkNNinit.R
cc918301931ef0dfa197a2f3d791b3d0 *R/EkNNval.R
52a431a233d5075e603d3b52f9c9782c *R/classds.R
115d774e5bce42f17e4c44ba20b1c5e2 *R/evclass.R
6f31c165a0d7a2a3e86c41b4e697ed38 *R/foncgradr2n.R
aeaad4cb6952aa9e16d979f5b01f0950 *R/glass-data.R
2980f6f9293ffb4e0dc4d828494c1c99 *R/gradientds.R
20a8bb25740f8f2445eda7ea0460ec20 *R/harris.R
7c919b97eb312a99911bba45aa026664 *R/ionosphere-data.R
848b2e2c11ce455a332dd8b0e7306f75 *R/optimds.R
7535b780d2fd7f9fc526016f40975819 *R/proDSfit.R
4523d06f8b0b94024d8d57b9819c9aac *R/proDSinit.R
cf689fe5918edf79efcfb88395fcbd20 *R/proDSval.R
a988fc277f3bfdb7f16cd8807d7bec4e *R/vehicles-data.R
988e27d0f83a7ff327e4f38f089aeefd *data/glass.RData
b7fa49999b8679c81898cc5262ee0c81 *data/ionosphere.RData
1665c19c5134e390d23a6f938557c3b5 *data/vehicles.RData
692b0234dd3ae533733ef4382a3109b6 *man/EkNNfit.Rd
df9e6f51bbe7a511a71c62669b0369e9 *man/EkNNinit.Rd
5c6c7e59a33b0c199ea37c15e93b6f1a *man/EkNNval.Rd
cf56f366ae4cec7a5916b3be9c7313c5 *man/evclass.Rd
ecf423e10f75e6dab5a7ced8a253575b *man/glass.Rd
144a8069c347be152abc7543780db1c2 *man/ionosphere.Rd
4f20dd9c17a3c66b6eda7e1c24012814 *man/proDSfit.Rd
19dc1316032dc586a3b83fd4d9b31945 *man/proDSinit.Rd
68c05de616fa459bba567ce123daebed *man/proDSval.Rd
6d00db39a18df54421981daa1b761cdc *man/vehicles.Rd
15c64d602e7264b11a8ca115a92a62f8 *vignettes/Introduction.Rmd
7a59c365f0124c990e37d0f2e8c9768c *vignettes/tdenoeux.bib
NAMESPACE
@@ -0,0 +1,12 @@
# Generated by roxygen2: do not edit by hand

export(EkNNfit)
export(EkNNinit)
export(EkNNval)
export(proDSfit)
export(proDSinit)
export(proDSval)
import(FNN)
importFrom(stats,dist)
importFrom(stats,kmeans)
importFrom(stats,runif)
R/EkNNfit.R
@@ -0,0 +1,66 @@
#' Training of the EkNN classifier
#'
#' \code{EkNNfit} optimizes the parameters of the EkNN classifier.
#'
#' If the argument \code{param} is not supplied, the function \code{\link{EkNNinit}} is called.
#'
#' @param x Input matrix of size n x d, where n is the number of objects and d the number of
#' attributes.
#' @param y Vector of class labels (of length n). May be a factor, or a vector of
#' integers.
#' @param K Number of neighbors.
#' @param param Initial parameters (default: NULL).
#' @param alpha Parameter \eqn{\alpha} (default: 0.95).
#' @param lambda Parameter of the cost function. If \code{lambda=1}, the
#' cost function measures the error between the plausibilities and the 0-1 target values.
#' If \code{lambda=1/M}, where M is the number of classes (default), the pignistic probabilities
#' are considered in the cost function. If \code{lambda=0}, the beliefs are used.
#' @param optimize Boolean. If TRUE (default), the parameters are optimized.
#' @param options A list of parameters for the optimization algorithm: maxiter
#' (maximum number of iterations), eta (initial step of gradient variation),
#' gain_min (minimum gain in the optimization loop), disp (Boolean; if TRUE, intermediate
#' results are displayed during the optimization).
#'
#' @return A list with five elements:
#' \describe{
#'   \item{param}{The optimized parameters.}
#'   \item{cost}{Final value of the cost function.}
#'   \item{err}{Leave-one-out error rate.}
#'   \item{ypred}{Leave-one-out predicted class labels.}
#'   \item{m}{Leave-one-out predicted mass functions. The first M columns correspond
#'   to the mass assigned to each class. The last column corresponds to the mass
#'   assigned to the whole set of classes.}
#' }
#'
#' @references T. Denoeux. A k-nearest neighbor classification rule based on Dempster-Shafer
#' theory. IEEE Transactions on Systems, Man and Cybernetics, 25(05):804--813, 1995.
#'
#' L. M. Zouhal and T. Denoeux. An evidence-theoretic k-NN rule with parameter
#' optimization. IEEE Transactions on Systems, Man and Cybernetics Part C,
#' 28(2):263--271, 1998.
#'
#' Available from \url{https://www.hds.utc.fr/~tdenoeux}.
#'
#' @author Thierry Denoeux.
#'
#' @export
#' @import FNN
#'
#' @seealso \code{\link{EkNNinit}}, \code{\link{EkNNval}}
#'
#' @examples ## Iris dataset
#' data(iris)
#' x<-iris[,1:4]
#' y<-iris[,5]
#' fit<-EkNNfit(x,y,K=5)
EkNNfit<-function(x,y,K,param=NULL,alpha=0.95,lambda=1/max(as.numeric(y)),optimize=TRUE,
                  options=list(maxiter=300,eta=0.1,gain_min=1e-6,disp=TRUE)){
  y<-as.integer(y)
  x<-as.matrix(x)
  if(is.null(param)) param<-EkNNinit(x,y,alpha)
  knn<-get.knn(x,k=K)
  knn$nn.dist<-knn$nn.dist^2
  # If optimization is disabled, keep the initial parameters
  if(optimize) opt<-optimds(x,y,param,knn,K,lambda,options) else opt<-list(param=param,cost=NULL)
  class <- classds(opt$param,knn,y,K)
  return(list(param=opt$param,cost=opt$cost,err=class$err,ypred=class$ypred,m=class$m))
}
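The `lambda=1/M` default above weights the cost function by pignistic probabilities. For the mass functions these classifiers output (one mass per singleton class plus one on the whole frame), the pignistic transform simply shares the frame mass equally among the M classes. A minimal NumPy sketch, outside the package (the helper name `pignistic` is ours):

```python
import numpy as np

def pignistic(m):
    """Pignistic transform of a mass vector [m({1}), ..., m({M}), m(Omega)]:
    the mass on the whole frame Omega is split equally among the M classes."""
    m = np.asarray(m, float)
    M = len(m) - 1
    return m[:M] + m[M] / M
```

For example, the mass function (0.5, 0.2, 0.3) over two classes and the frame yields pignistic probabilities (0.65, 0.35).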
R/EkNNinit.R
@@ -0,0 +1,54 @@
#' Initialization of parameters for the EkNN classifier
#'
#' \code{EkNNinit} returns initial parameter values for the EkNN classifier.
#'
#' Each parameter \eqn{\gamma_k} is set to the inverse of the square root of the mean
#' Euclidean distance within class k. Note that \eqn{\gamma_k} here is the square root
#' of the \eqn{\gamma_k} as defined in (Zouhal and Denoeux, 1998). By default, parameter alpha is set
#' to 0.95. This value normally does not have to be changed.
#'
#' @param x Input matrix of size n x d, where n is the number of objects and d the number of
#' attributes.
#' @param y Vector of class labels (of length n). May be a factor, or a vector of
#' integers.
#' @param alpha Parameter \eqn{\alpha}.
#'
#' @return A list with two elements:
#' \describe{
#'   \item{gamma}{Vector of parameters \eqn{\gamma_k}, of length c, the number of classes.}
#'   \item{alpha}{Parameter \eqn{\alpha}, set to 0.95.}
#' }
#'
#' @references T. Denoeux. A k-nearest neighbor classification rule based on Dempster-Shafer
#' theory. IEEE Transactions on Systems, Man and Cybernetics, 25(05):804--813, 1995.
#'
#' L. M. Zouhal and T. Denoeux. An evidence-theoretic k-NN rule with parameter
#' optimization. IEEE Transactions on Systems, Man and Cybernetics Part C,
#' 28(2):263--271, 1998.
#'
#' Available from \url{https://www.hds.utc.fr/~tdenoeux}.
#'
#' @author Thierry Denoeux.
#'
#' @export
#' @importFrom stats dist
#'
#' @seealso \code{\link{EkNNfit}}, \code{\link{EkNNval}}
#'
#' @examples ## Iris dataset
#' data(iris)
#' x<-iris[,1:4]
#' y<-iris[,5]
#' param<-EkNNinit(x,y)
#' param
EkNNinit<-function(x,y,alpha=0.95){
  y<-as.numeric(y)
  M<-max(y)
  gamm<-rep(0,M)
  for(k in 1:M){
    D<-dist(x[y==k,])
    gamm[k]<-1/sqrt(mean(D))
  }
  return(list(gamma=gamm,alpha=alpha))
}
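The heuristic above sets \eqn{\gamma_k} to the inverse square root of the mean within-class pairwise Euclidean distance. For readers outside R, a minimal NumPy sketch of the same convention (the function name `eknn_init_gamma` is ours, not part of the package):

```python
import numpy as np

def eknn_init_gamma(x, y):
    """Per-class scale parameters, mirroring EkNNinit:
    gamma_k = 1 / sqrt(mean pairwise Euclidean distance within class k)."""
    x, y = np.asarray(x, float), np.asarray(y, int)
    gammas = []
    for k in np.unique(y):          # classes in sorted order
        xk = x[y == k]
        i, j = np.triu_indices(len(xk), k=1)
        d = np.linalg.norm(xk[i] - xk[j], axis=1)  # all within-class distances
        gammas.append(1.0 / np.sqrt(d.mean()))
    return np.array(gammas)
```

With two points at distance 2 in a class, the mean distance is 2 and \eqn{\gamma = 1/\sqrt{2} \approx 0.707}.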
R/EkNNval.R
@@ -0,0 +1,94 @@
#' Classification of a test set by the EkNN classifier
#'
#' \code{EkNNval} classifies instances in a test set using the EkNN classifier.
#'
#' If class labels for the test set are provided, the test error rate is also returned.
#' If parameters are not supplied, they are given default values by \code{\link{EkNNinit}}.
#'
#' @param xtrain Matrix of size ntrain x d, containing the values of the d attributes for the
#' training data.
#' @param ytrain Vector of class labels for the training data (of length ntrain). May
#' be a factor, or a vector of integers.
#' @param xtst Matrix of size ntst x d, containing the values of the d attributes for the
#' test data.
#' @param K Number of neighbors.
#' @param ytst Vector of class labels for the test data (optional). May
#' be a factor, or a vector of integers.
#' @param param Parameters, as returned by \code{\link{EkNNfit}}.
#'
#' @return A list with three elements:
#' \describe{
#'   \item{m}{Predicted mass functions for the test data. The first M columns correspond
#'   to the mass assigned to each class. The last column corresponds to the mass
#'   assigned to the whole set of classes.}
#'   \item{ypred}{Predicted class labels for the test data.}
#'   \item{err}{Test error rate.}
#' }
#'
#' @references T. Denoeux. A k-nearest neighbor classification rule based on Dempster-Shafer
#' theory. IEEE Transactions on Systems, Man and Cybernetics, 25(05):804--813, 1995.
#'
#' L. M. Zouhal and T. Denoeux. An evidence-theoretic k-NN rule with parameter
#' optimization. IEEE Transactions on Systems, Man and Cybernetics Part C,
#' 28(2):263--271, 1998.
#'
#' Available from \url{https://www.hds.utc.fr/~tdenoeux}.
#'
#' @author Thierry Denoeux.
#'
#' @export
#' @import FNN
#'
#' @seealso \code{\link{EkNNinit}}, \code{\link{EkNNfit}}
#'
#' @examples ## Iris dataset
#' data(iris)
#' train<-sample(150,100)
#' xtrain<-iris[train,1:4]
#' ytrain<-iris[train,5]
#' xtst<-iris[-train,1:4]
#' ytst<-iris[-train,5]
#' K<-5
#' fit<-EkNNfit(xtrain,ytrain,K)
#' test<-EkNNval(xtrain,ytrain,xtst,K,ytst,fit$param)
EkNNval <- function(xtrain,ytrain,xtst,K,ytst=NULL,param=NULL){

  ytrain<-as.numeric(ytrain)
  if(!is.null(ytst)) ytst<-as.numeric(ytst)

  if(is.null(param)) param<-EkNNinit(xtrain,ytrain)

  Napp<-nrow(xtrain)
  M<-max(ytrain)
  N<-nrow(xtst)

  knn<-get.knnx(xtrain, xtst, k=K)
  knn$nn.dist<-knn$nn.dist^2
  is<-t(knn$nn.index)
  ds<-t(knn$nn.dist)

  m <- rbind(matrix(0,M,N),rep(1,N))

  for(i in 1:N){
    for(j in 1:K){
      m1 <- rep(0,M+1)
      m1[ytrain[is[j,i]]] <- param$alpha*exp(-param$gamma[ytrain[is[j,i]]]^2*ds[j,i])
      m1[M+1] <- 1 - m1[ytrain[is[j,i]]]
      m[1:M,i] <- m1[1:M]*m[1:M,i] + m1[1:M]*m[M+1,i] + m[1:M,i]*m1[M+1]
      m[M+1,i] <- m1[M+1] * m[M+1,i]
      m<-m/matrix(colSums(m),M+1,N,byrow=TRUE)
    }
  }
  m<-t(m)
  ypred<-max.col(m[,1:M])
  if(!is.null(ytst)) err<-length(which(ypred != ytst))/N else err<-NULL

  return(list(m=m,ypred=ypred,err=err))
}
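The inner loop above combines, for each neighbor, a simple mass function `m1` (mass on the neighbor's class plus the remainder on the whole frame) with the running mass `m`. This is Dempster's rule restricted to mass functions whose focal sets are singletons and the frame, followed by normalization. A minimal NumPy sketch of that one combination step (illustrative only, not package code; `combine_simple` is our name):

```python
import numpy as np

def combine_simple(m, m1):
    """Dempster's rule for two mass vectors [m({1}), ..., m({M}), m(Omega)]
    whose focal sets are singletons and the whole frame Omega."""
    m, m1 = np.asarray(m, float), np.asarray(m1, float)
    M = len(m) - 1
    out = np.empty(M + 1)
    # mass on class k: both agree on k, or one puts mass on k and the other on Omega
    out[:M] = m1[:M]*m[:M] + m1[:M]*m[M] + m[:M]*m1[M]
    out[M] = m1[M] * m[M]          # both masses on Omega
    return out / out.sum()         # renormalize away the conflict
```

Combining the vacuous mass (0, 0, 1) with (0.5, 0, 0.5) leaves the latter unchanged, while two conflicting simple masses on different classes yield a symmetric result.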
R/classds.R
@@ -0,0 +1,35 @@
# Internal: leave-one-out classification of the training data with the EkNN rule.
classds<-function(param,knn,y,K){
  N<-length(y)
  M<-max(y)
  mk <- rbind(matrix(0,M,N),rep(1,N))  # start from the vacuous mass function
  is<-t(knn$nn.index)
  ds<-t(knn$nn.dist)

  for(k in 1:K){
    Is <- y[is[k,]]                    # class labels of the k-th neighbors
    Tk <- matrix(0,M,N)
    for(j in 1:M){
      pos <- which(Is==j)
      if(length(pos) != 0) Tk[j,pos] <- rep(1,length(pos))
    }
    G <- matrix(param$gamma^2,M,N) * Tk
    gam <- apply(G,2,max)
    s <- param$alpha*exp(-gam*ds[k,])
    m <- rbind(Tk*matrix(s,M,N,byrow=TRUE),1-s)
    # Dempster's rule for singleton focal sets plus the frame, then renormalize
    mk <- rbind( mk[1:M,]*(m[1:M,]+matrix(m[M+1,],M,N,byrow=TRUE))+
                   m[1:M,]*matrix(mk[M+1,],M,N,byrow=TRUE),mk[M+1,]*m[M+1,])
    Kn <- colSums(mk)
    mk <- mk/matrix(Kn,M+1,N,byrow=TRUE)
  }
  mk<-t(mk)
  L<-max.col(mk[,1:M])
  err<-length(which(L != y))/N
  return(list(m=mk,ypred=L,err=err))
}
R/evclass.R
@@ -0,0 +1,32 @@
#' evclass: A package for evidential classification
#'
#' The evclass package currently contains functions for two evidential classifiers: the evidential
#' K-nearest neighbor (EK-NN) rule (Denoeux, 1995; Zouhal and Denoeux, 1998) and the evidential
#' neural network (Denoeux, 2000). In contrast with classical statistical classifiers, evidential
#' classifiers quantify the uncertainty of the classification using Dempster-Shafer mass functions.
#'
#' The main functions are: \code{\link{EkNNinit}}, \code{\link{EkNNfit}} and \code{\link{EkNNval}}
#' for the initialization, training and evaluation of the EK-NN classifier, and
#' \code{\link{proDSinit}}, \code{\link{proDSfit}} and \code{\link{proDSval}} for the
#' evidential neural network classifier.
#'
#' @docType package
#' @name evclass
#'
#' @seealso \code{\link{EkNNinit}}, \code{\link{EkNNfit}},
#' \code{\link{EkNNval}}, \code{\link{proDSinit}}, \code{\link{proDSfit}}, \code{\link{proDSval}}.
#'
#' @references
#' T. Denoeux. A k-nearest neighbor classification rule based on Dempster-Shafer
#' theory. IEEE Transactions on Systems, Man and Cybernetics, 25(05):804--813, 1995.
#'
#' T. Denoeux. A neural network classifier based on Dempster-Shafer theory.
#' IEEE Trans. on Systems, Man and Cybernetics A, 30(2):131--150, 2000.
#'
#' L. M. Zouhal and T. Denoeux. An evidence-theoretic k-NN rule with parameter
#' optimization. IEEE Transactions on Systems, Man and Cybernetics Part C,
#' 28(2):263--271, 1998.
#'
#' Available from \url{https://www.hds.utc.fr/~tdenoeux}.
#'
NULL