Commit

Version 1.0.4

Hiroyuki KASAI committed Jul 12, 2017
1 parent b704d92 commit 36ba9c0
Showing 14 changed files with 1,311 additions and 1 deletion.
7 changes: 6 additions & 1 deletion README.md
@@ -72,12 +72,16 @@ List of algorithms
- **LC-KSVD** (Label Consistent K-SVD)
- Z. Jiang, Z. Lin, L. S. Davis, "[Learning a discriminative dictionary for sparse coding via label consistent K-SVD](http://ieeexplore.ieee.org/abstract/document/5995354/)," IEEE Conference on Computer Vision and Pattern Recognition (CVPR2011), 2011.
- Z. Jiang, Z. Lin, L. S. Davis, "[Label consistent K-SVD: learning A discriminative dictionary for recognition](http://ieeexplore.ieee.org/document/6516503/)," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.35, no.11, pp.2651-2664, 2013.
- **FDDL** (Fisher Discriminative Dictionary Learning)
- M. Yang, L. Zhang, X. Feng, and D. Zhang, "[Fisher discrimination dictionary learning for sparse representation](http://ieeexplore.ieee.org/document/6126286/)," IEEE International Conference on Computer Vision (ICCV), 2011.
- **Geometry-aware**
- **R-KSRC (Stein kernel)** (Riemannian kernelized sparse representation classification)
- M. Harandi, R. Hartley, B. Lovell and C. Sanderson, "[Sparse coding on symmetric positive definite manifolds using bregman divergences](http://ieeexplore.ieee.org/document/7024121/)," IEEE Transactions on Neural Networks and Learning Systems (TNNLS), 2016.
- M. Harandi, C. Sanderson, R. Hartley and B. Lovell, "[Sparse coding and dictionary learning for symmetric positive definite matrices: a kernel approach](https://drive.google.com/uc?export=download&id=0B9_PW9TCpxT0eW00U1FVd0xaSmM)," European Conference on Computer Vision (ECCV), 2012.
- **R-KSRC (Log-Euclidean kernel)** (Riemannian kernelized sparse representation classification)
- P. Li, Q. Wang, W. Zuo, and L. Zhang, "[Log-Euclidean kernels for sparse representation and dictionary learning](http://ieeexplore.ieee.org/document/6751309/)," IEEE International Conference on Computer Vision (ICCV), 2013.
- S. Jayasumana, R. Hartley, M. Salzmann, H. Li, and M. Harandi, "[Kernel methods on the Riemannian manifold of symmetric positive definite matrices](http://ieeexplore.ieee.org/document/6618861/)," IEEE Conference on Computer Vision and Pattern Recognition (CVPR2013), 2013.
- S. Jayasumana, R. Hartley, M. Salzmann, H. Li, and M. Harandi, "[Kernel methods on the Riemannian manifold with Gaussian RBF Kernels](http://ieeexplore.ieee.org/document/7063231/)," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.37, no.12, 2015.
- [Reference] **R-KSRC (Data-dependent kernel)** [Not included in this package]
- Y. Wu, Y. Jia, P. Li, J. Zhang, and J. Yuan, "[Manifold kernel sparse representation of symmetric positive definite matrices and its applications](http://ieeexplore.ieee.org/document/7145428/)," IEEE Transactions on Image Processing, vol.24, no.11, 2015.

@@ -199,10 +203,11 @@ Third party tools, libraries, and packages.
- [KSVDBox](http://www.cs.technion.ac.il/~ronrubin/Software/ksvdbox13.zip) is used for K-SVD algorithm.
- [SPAMS](http://spams-devel.gforge.inria.fr/downloads.html) is used for various lasso problems.
- [LC-KSVD](https://www.umiacs.umd.edu/~zhuolin/projectlcksvd.html).
- [FDDL](http://www4.comp.polyu.edu.hk/~cslzhang/code/FDDL.zip).
- [RSR](https://drive.google.com/uc?export=download&id=0B9_PW9TCpxT0ZVpGRDNLX3NCbXc).
- [Learning Discriminative Stein Kernel for SPD Matrices and Its Applications](https://github.com/seuzjj/DSK/archive/master.zip).
- [SDR-SLR](http://www3.ntu.edu.sg/home/EXDJiang/CodesPAMI2015.zip).
- [R-KSRC (Log-Euclidean kernel)](http://www4.comp.polyu.edu.hk/~cslzhang/LogEKernel_Project/ICCV_LogEKernel_Code.zip).
- DERLR.
- [JACOBI_EIGENVALUE](https://people.sc.fsu.edu/~jburkardt/m_src/jacobi_eigenvalue/jacobi_eigenvalue.html) is a MATLAB library which computes the eigenvalues and eigenvectors of a real symmetric matrix.
- [NMFLibrary](https://github.com/hiroyuki-kasai/NMFLibrary) is for [NMF](https://en.wikipedia.org/wiki/Non-negative_matrix_factorization).
39 changes: 39 additions & 0 deletions lib/FDDL/Readme.txt
@@ -0,0 +1,39 @@
% ========================================================================
% Fisher Discriminative Dictionary Learning (FDDL), Version 1.0
% Copyright(c) 2011 Meng YANG, Lei Zhang, Xiangchu Feng and David Zhang
% All Rights Reserved.
%
% The code is for the paper:

% M. Yang, L. Zhang, X. Feng and D. Zhang,
% "Fisher Discrimination Dictionary Learning for Sparse Representation," in ICCV 2011.

% ----------------------------------------------------------------------
% Permission to use, copy, or modify this software and its documentation
% for educational and research purposes only and without fee is here
% granted, provided that this copyright notice and the original authors'
% names appear on all copies and supporting documentation. This program
% shall not be used, rewritten, or adapted as the basis of a commercial
% software or hardware product without first obtaining permission of the
% authors. The authors make no representations about the suitability of
% this software for any purpose. It is provided "as is" without express
% or implied warranty.
%----------------------------------------------------------------------

demo.m Face recognition demo on AR database with 300-d Eigenface feature

utilies : folder of FDDL functions, including
    Eigenface_f: function of computing the PCA projection matrix
    FDDL: main function of FDDL
    FDDL_Class_Energy: function of computing the energy of a certain class
    FDDL_FDL_Energy: function of computing the energy of all classes
    FDDL_Gradient_Comp: function of computing the coding model's gradient
    FDDL_INIC: function of initializing the representation coefficients
    FDDL_INID: function of initializing the dictionary
    FDDL_SpaCoef: function of computing the coding coefficients
    FDDL_UpdateDi: function of updating the dictionary
    IPM_SC: sparse coding function
    soft: soft thresholding function (a minimal sketch follows this file's listing)

%-------------------------------------------------------------------------
Contact: {csmyang,cslzhang}@comp.polyu.edu.hk
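soft.m itself is not included in this commit excerpt, so the following is only a minimal sketch of the standard element-wise soft-thresholding operator such a helper is usually assumed to implement (the parameter name tau is hypothetical):

function y = soft(x, tau)
% Element-wise soft thresholding (shrinkage): y = sign(x) .* max(|x| - tau, 0).
% Entries of x with magnitude below tau become zero; the rest shrink toward zero by tau.
y = sign(x) .* max(abs(x) - tau, 0);
end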
57 changes: 57 additions & 0 deletions lib/FDDL/demo.m
@@ -0,0 +1,57 @@
close all;
clear all;
clc;

addpath([cd '/utilies']);
load(['AR_EigenFace']);

%%%%%%%%%%%%%%%%%%%%%%%%
%FDDL parameter
%%%%%%%%%%%%%%%%%%%%%%%%
opts.nClass = 100;
opts.wayInit = 'PCA';
opts.lambda1 = 0.005;
opts.lambda2 = 0.05;
opts.nIter = 15;
opts.show = true;
[Dict,Drls,CoefM,CMlabel] = FDDL(tr_dat,trls,opts);

%%%%%%%%%%%%%%%%%%%%%%%%
% Sparse Classification
%%%%%%%%%%%%%%%%%%%%%%%%
lambda = 0.005;
nClass = opts.nClass;
weight = 0.5;

td1_ipts.D = Dict;
td1_ipts.tau1 = lambda;
if size(td1_ipts.D,1)>=size(td1_ipts.D,2)
    td1_par.eigenv = eigs(td1_ipts.D'*td1_ipts.D,1);
else
    td1_par.eigenv = eigs(td1_ipts.D*td1_ipts.D',1);
end

ID = [];
for indTest = 1:size(tt_dat,2)
    fprintf(['Total num: ' num2str(size(tt_dat,2)) '  Now processing: ' num2str(indTest) '\n']);
    td1_ipts.y = tt_dat(:,indTest);
    [opts] = IPM_SC(td1_ipts,td1_par);
    s = opts.x;

    % Per-class residuals: reconstruction error using only the atoms of each
    % class, plus the distance to that class's mean coefficient vector.
    for indClass = 1:nClass
        temp_s = zeros(size(s));
        temp_s(indClass==Drls) = s(indClass==Drls);
        zz = tt_dat(:,indTest)-td1_ipts.D*temp_s;
        gap(indClass) = zz(:)'*zz(:);

        mean_coef_c = CoefM(:,indClass);
        gCoef3(indClass) = norm(s-mean_coef_c,2)^2;
    end

    % Assign the label of the class with the smallest weighted residual.
    wgap3 = gap + weight*gCoef3;
    index3 = find(wgap3==min(wgap3));
    id3 = index3(1);
    ID = [ID id3];
end

fprintf('%s%8f\n','reco_rate = ',sum(ID==ttls)/(length(ttls)));
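For reference, the test loop above implements FDDL's global classification rule (this is only a restatement of what gap, gCoef3, and wgap3 compute): with $\hat{s}$ the sparse code of test sample $y$ over the whole dictionary $D$, $\delta_i(\hat{s})$ the entries of $\hat{s}$ belonging to class $i$ (selected via Drls), and $m_i$ the mean coefficient vector of class $i$ (a column of CoefM),

$$\operatorname{identity}(y) = \arg\min_i \; \|y - D\,\delta_i(\hat{s})\|_2^2 + w\,\|\hat{s} - m_i\|_2^2,$$

where $w$ is the `weight` variable (0.5 in this demo).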
54 changes: 54 additions & 0 deletions lib/FDDL/utilies/Eigenface_f.m
@@ -0,0 +1,54 @@
function [disc_set,disc_value,Mean_Image]=Eigenface_f(Train_SET,Eigen_NUM)

% The magnitude of the eigenvalues returned by this function has been corrected.
% Centered PCA
[NN,Train_NUM]=size(Train_SET);

if NN<=Train_NUM % regular case: dimensionality does not exceed the number of samples

    Mean_Image=mean(Train_SET,2);
    Train_SET=Train_SET-Mean_Image*ones(1,Train_NUM);
    R=Train_SET*Train_SET'/(Train_NUM-1);

    [V,S]=Find_K_Max_Eigen(R,Eigen_NUM);
    disc_value=S;
    disc_set=V;

else % small sample size case: fewer samples than dimensions

    Mean_Image=mean(Train_SET,2);
    Train_SET=Train_SET-Mean_Image*ones(1,Train_NUM);

    % Gram-matrix trick: eigen-decompose the (Train_NUM x Train_NUM) Gram matrix
    % instead of the (NN x NN) covariance matrix.
    R=Train_SET'*Train_SET/(Train_NUM-1);

    [V,S]=Find_K_Max_Eigen(R,Eigen_NUM);
    disc_value=S;
    disc_set=zeros(NN,Eigen_NUM);

    Train_SET=Train_SET/sqrt(Train_NUM-1);
    for k=1:Eigen_NUM
        disc_set(:,k)=(1/sqrt(disc_value(k)))*Train_SET*V(:,k);
    end

end

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

function [Eigen_Vector,Eigen_Value]=Find_K_Max_Eigen(Matrix,Eigen_NUM)

NN=size(Matrix,1);
[V,S]=eig(Matrix);

S=diag(S);
[S,index]=sort(S); % ascending order

Eigen_Vector=zeros(NN,Eigen_NUM);
Eigen_Value=zeros(1,Eigen_NUM);

% Keep the Eigen_NUM largest eigenvalues and their eigenvectors, largest first.
p=NN;
for t=1:Eigen_NUM
    Eigen_Vector(:,t)=V(:,index(p));
    Eigen_Value(t)=S(p);
    p=p-1;
end
170 changes: 170 additions & 0 deletions lib/FDDL/utilies/FDDL.m
@@ -0,0 +1,170 @@
function [Dict,Drls,CoefM,CMlabel] = FDDL(TrainDat,TrainLabel,opts)
% ========================================================================
% Fisher Discriminative Dictionary Learning (FDDL), Version 1.0
% Copyright(c) 2011 Meng YANG, Lei Zhang, Xiangchu Feng and David Zhang
% All Rights Reserved.
%
% ----------------------------------------------------------------------
% Permission to use, copy, or modify this software and its documentation
% for educational and research purposes only and without fee is here
% granted, provided that this copyright notice and the original authors'
% names appear on all copies and supporting documentation. This program
% shall not be used, rewritten, or adapted as the basis of a commercial
% software or hardware product without first obtaining permission of the
% authors. The authors make no representations about the suitability of
% this software for any purpose. It is provided "as is" without express
% or implied warranty.
%----------------------------------------------------------------------
%
% This is an implementation of the algorithm for learning the
% Fisher Discriminative Dictionary from a labeled training data
%
% Please refer to the following paper
%
% Meng Yang, Lei Zhang, Xiangchu Feng, and David Zhang,"Fisher Discrimination
% Dictionary Learning for Sparse Representation", In IEEE Int. Conf. on
% Computer Vision, 2011.
%
%----------------------------------------------------------------------
%
%Input : (1) TrainDat: the training data matrix.
%                      Each column is a training sample
%        (2) TrainLabel: the training data labels
%        (3) opts      : the structure of parameters
%                .nClass   the number of classes
%                .wayInit  the way to initialize the dictionary
%                .lambda1  the parameter of the l1-norm energy of the coefficients
%                .lambda2  the parameter of the l2-norm Fisher discriminative
%                          coefficient term
%                .nIter    the number of FDDL iterations
%                .show     whether to plot the gap sequences
%
%Output: (1) Dict:    the learnt dictionary via FDDL
%        (2) Drls:    the labels of the learnt dictionary's columns
%        (3) CoefM:   mean coefficient matrix. Each column is a mean coef
%                     vector
%        (4) CMlabel: the labels of CoefM's columns.
%
%-----------------------------------------------------------------------
%
%Usage:
%Given training data TrainDat with labels TrainLabel, and the parameter
%structure opts:
%
%[Dict,Drls,CoefM,CMlabel] = FDDL(TrainDat,TrainLabel,opts)
%-----------------------------------------------------------------------
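% [Sketch] The model optimized below, following the formulation in the ICCV 2011
% paper (a paraphrase for orientation; symbols A_i, X_i, D_i, S_W, S_B, eta are
% taken from the paper):
%
%   min_{D,X}  sum_i r(A_i, D, X_i) + lambda1*||X||_1
%              + lambda2*( tr(S_W(X)) - tr(S_B(X)) + eta*||X||_F^2 )
%
%   with r(A_i, D, X_i) = ||A_i - D*X_i||_F^2 + ||A_i - D_i*X_i^i||_F^2
%                         + sum_{j~=i} ||D_j*X_i^j||_F^2,
%
% where A_i holds the training samples of class i, X_i their coding
% coefficients, D_i the sub-dictionary associated with class i, and
% S_W / S_B the within-class and between-class scatter of the coefficients.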
%%%%%%%%%%%%%%%%%%
% normalize energy
%%%%%%%%%%%%%%%%%%
TrainDat = TrainDat*diag(1./sqrt(sum(TrainDat.*TrainDat)));
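% Each column (training sample) of TrainDat is scaled to unit l2 norm.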

%%%%%%%%%%%%%%%%%%
%initialize dict
%%%%%%%%%%%%%%%%%%
Dict_ini = [];
Dlabel_ini = [];
for ci = 1:opts.nClass
    cdat = TrainDat(:,TrainLabel==ci);
    dict = FDDL_INID(cdat,size(cdat,2),opts.wayInit);
    Dict_ini = [Dict_ini dict];
    Dlabel_ini = [Dlabel_ini repmat(ci,[1 size(dict,2)])];
end

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%initialize coef without between-class scatter
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
ini_par.tau = opts.lambda1;
ini_par.lambda = opts.lambda2;
ini_ipts.D = Dict_ini;
coef = zeros(size(Dict_ini,2),size(TrainDat,2));
if size(Dict_ini,1)>size(Dict_ini,2)
    ini_par.c = 1.05*eigs(Dict_ini'*Dict_ini,1);
else
    ini_par.c = 1.05*eigs(Dict_ini*Dict_ini',1);
end
for ci = 1:opts.nClass
    fprintf(['Initializing Coef: Class ' num2str(ci) '\n']);
    ini_ipts.X = TrainDat(:,TrainLabel==ci);
    [ini_opts] = FDDL_INIC (ini_ipts,ini_par);
    coef(:,TrainLabel==ci) = ini_opts.A;
end

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%Main loop of Fisher Discriminative Dictionary Learning
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Fish_par.dls = Dlabel_ini;
Fish_ipts.D = Dict_ini;
Fish_ipts.trls = TrainLabel;
Fish_par.tau = opts.lambda1;
Fish_par.lambda2 = opts.lambda2;

Fish_nit = 1;
drls = Dlabel_ini;
while Fish_nit<=opts.nIter
    if size(Fish_ipts.D,1)>size(Fish_ipts.D,2)
        Fish_par.c = 1.05*eigs(Fish_ipts.D'*Fish_ipts.D,1);
    else
        Fish_par.c = 1.05*eigs(Fish_ipts.D*Fish_ipts.D',1);
    end
    %-------------------------
    %updating the coefficients
    %-------------------------
    for ci = 1:opts.nClass
        fprintf(['Updating coefficients, class: ' num2str(ci) '\n'])
        Fish_ipts.X = TrainDat(:,TrainLabel==ci);
        Fish_ipts.A = coef;
        Fish_par.index = ci;
        [Copts] = FDDL_SpaCoef (Fish_ipts,Fish_par);
        coef(:,TrainLabel==ci) = Copts.A;
        CMlabel(ci) = ci;
        CoefM(:,ci) = mean(Copts.A,2);
    end
    [GAP_coding(Fish_nit)] = FDDL_FDL_Energy(TrainDat,coef,opts.nClass,Fish_par,Fish_ipts);

    %------------------------
    %updating the dictionary
    %------------------------
    for ci = 1:opts.nClass
        fprintf(['Updating dictionary, class: ' num2str(ci) '\n']);
        [Fish_ipts.D(:,drls==ci),Delt(ci).delet]= FDDL_UpdateDi (TrainDat,coef,...
            ci,opts.nClass,Fish_ipts,Fish_par);
    end
    [GAP_dict(Fish_nit)] = FDDL_FDL_Energy(TrainDat,coef,opts.nClass,Fish_par,Fish_ipts);

    % Rebuild the dictionary, its labels, and the coefficients, dropping the
    % atoms that FDDL_UpdateDi marked for deletion.
    newD = []; newdrls = []; newcoef = [];
    for ci = 1:opts.nClass
        delet = Delt(ci).delet;
        if isempty(delet)
            classD = Fish_ipts.D(:,drls==ci);
            newD = [newD classD];
            newdrls = [newdrls repmat(ci,[1 size(classD,2)])];
            newcoef = [newcoef; coef(drls==ci,:)];
        else
            temp = Fish_ipts.D(:,drls==ci);
            temp_coef = coef(drls==ci,:);
            for temp_i = 1:size(temp,2)
                if sum(delet==temp_i)==0
                    newD = [newD temp(:,temp_i)];
                    newdrls = [newdrls ci];
                    newcoef = [newcoef;temp_coef(temp_i,:)];
                end
            end
        end
    end

    Fish_ipts.D = newD;
    coef = newcoef;
    drls = newdrls;
    Fish_par.dls = drls;

    Fish_nit = Fish_nit +1;
end

Dict = Fish_ipts.D;
Drls = drls;

if opts.show
    subplot(1,2,1);plot(GAP_coding,'-*');title('GAP_coding');
    subplot(1,2,2);plot(GAP_dict,'-o');title('GAP_dict');
end
return;
