tiepvupsu/DICTOL

This repo is no longer maintained!

DICTOL - A Discriminative Dictionary Learning Toolbox for Classification (MATLAB version).

This Toolbox is a part of our LRSDL project.

See the Python version.

Related publications:

  1. Tiep H. Vu, Vishal Monga. "Fast Low-rank Shared Dictionary Learning for Image Classification." to appear in IEEE Transactions on Image Processing. [paper].

  2. Tiep H. Vu, Vishal Monga. "Learning a low-rank shared dictionary for object classification." International Conference on Image Processing (ICIP) 2016. [paper].

Author: Tiep Vu

Run DICTOL_demo.m to see examples.

If you experience any bugs, please let me know via the Issues tab. I'd really appreciate it and will fix the error ASAP. Thank you.


Notation

  • Y: signals. Each column is one observation.
  • D: dictionary.
  • X: sparse coefficient.
  • d: signal dimension. d = size(Y, 1).
  • C: number of classes.
  • c: class index.
  • n_c: number of training samples in class c. Typically, all n_c are the same and equal to n.
  • N: total number of training samples.
  • Y_range: an array storing the range of each class, assuming labels are sorted in ascending order. Example: If Y_range = [0, 10, 25], then:
    • There are two classes; samples from class 1 are columns 1 to 10, and samples from class 2 are columns 11 to 25.
    • In general, samples from class c range from Y_range(c) + 1 to Y_range(c+1).
    • It follows that the number of classes is C = numel(Y_range) - 1.
  • k_c: number of bases in the class-specific dictionary c. Typically, all k_c are the same and equal to k.
  • k_0: number of bases in the shared dictionary.
  • K: total number of dictionary bases.
  • D_range: similar to Y_range but used for dictionary without the shared dictionary.
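
The Y_range convention above can be sanity-checked with a small sketch. This is illustrative only (the toolbox itself is MATLAB); the helper `class_columns` is a hypothetical name, not part of DICTOL:

```python
# Illustrative sketch of the Y_range convention described above.
Y_range = [0, 10, 25]        # two classes
C = len(Y_range) - 1         # number of classes = numel(Y_range) - 1

def class_columns(Y_range, c):
    """1-based column indices of class c (c = 1..C), matching the MATLAB convention."""
    return list(range(Y_range[c - 1] + 1, Y_range[c] + 1))
```

For example, `class_columns(Y_range, 2)` gives the columns 11 through 25 of class 2.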

Sparse Representation-based classification (SRC)

  • An implementation of Sparse Representation-based Classification [1].
  • Syntax: [pred, X] = SRC_pred(Y, D, D_range, opts)
    • INPUT:
      • Y: test samples.
      • D: the total dictionary. D = [D_1, D_2, ..., D_C] with D_c being the c-th class-specific dictionary.
      • D_range: range of class-specific dictionaries in D. See also Notation.
      • opts: options.
    • OUTPUT:
      • pred: predicted labels of test samples.
      • X: solution of the lasso problem.
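
The SRC decision rule itself is simple: sparse-code the test sample over the concatenated dictionary, then assign the class whose atoms give the smallest reconstruction residual. Below is a minimal, hedged sketch of that rule in Python/NumPy (not the toolbox API; `ista_lasso` is a basic ISTA solver standing in for the lasso step, and `D_range` here uses 0-based Python indexing):

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_lasso(D, y, lam, n_iter=200):
    # Minimize 0.5*||y - D x||_2^2 + lam*||x||_1 by ISTA.
    L = np.linalg.norm(D, 2) ** 2           # Lipschitz constant of the gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x + D.T @ (y - D @ x) / L, lam / L)
    return x

def src_predict(y, D, D_range, lam=0.01):
    # Code y over D = [D_1 ... D_C], then pick the class with smallest residual.
    x = ista_lasso(D, y, lam)
    residuals = []
    for c in range(len(D_range) - 1):
        delta = np.zeros_like(x)
        delta[D_range[c]:D_range[c + 1]] = x[D_range[c]:D_range[c + 1]]
        residuals.append(np.linalg.norm(y - D @ delta))
    return int(np.argmin(residuals)) + 1    # 1-based class label
```

A test sample equal to an atom of class c is then classified as class c, since only the class-c coefficients are needed to reconstruct it.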

Online Dictionary Learning (ODL)

  • An implementation of the well-known Online Dictionary Learning method [2].

Cost function

Training ODL

  • Syntax: [D, X] = ODL(Y, k, lambda, opts, sc_method)
    • INPUT:
      • Y: collection of samples.
      • k: number of bases in the desired dictionary.
      • lambda: the l1-norm regularization parameter.
      • opts: options.
      • sc_method: sparse coding method used in the sparse coefficient update. Possible values:
        • 'fista': using FISTA algorithm. See also fista.
        • 'spams': using the SPAMS toolbox [11].
    • OUTPUT:
      • D, X: as in the problem.
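
The core of ODL [2] is its dictionary update: columns are refined one at a time from the accumulated statistics A = Σ x_i x_i^T and B = Σ y_i x_i^T, with each column projected back onto the unit ball. A hedged NumPy sketch of that single step (illustrative, not the toolbox code):

```python
import numpy as np

def odl_dictionary_update(D, A, B, n_iter=1):
    # Block-coordinate dictionary update from Mairal et al. [2]:
    # A = sum_i x_i x_i^T and B = sum_i y_i x_i^T accumulate past statistics.
    D = D.copy()
    for _ in range(n_iter):
        for j in range(D.shape[1]):
            if A[j, j] == 0:
                continue                     # atom j unused so far; skip it
            u = D[:, j] + (B[:, j] - D @ A[:, j]) / A[j, j]
            D[:, j] = u / max(np.linalg.norm(u), 1.0)   # project onto unit ball
    return D
```

Each column update exactly minimizes the quadratic surrogate 0.5*tr(D^T D A) - tr(D^T B) with the other columns fixed, so the surrogate never increases.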

LCKSVD

See its project page [3].

Dictionary learning with structured incoherence and shared features (DLSI)

  • An implementation of the well-known DLSI method [5].

Cost function

Training DLSI

  • function [D, X, rt] = DLSI(Y, Y_range, opts)
  • The main DLSI algorithm
  • INPUT:
    • Y, Y_range: training samples and their label ranges (see Notation).
    • opts:
      • opts.lambda, opts.eta: lambda and eta in the cost function
      • opts.max_iter: maximum iterations.
  • OUTPUT:
    • D: the trained dictionary,
    • X: the trained sparse coefficient,
    • rt: total running time of the training process.
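
The distinctive ingredient of DLSI [5] is its structured-incoherence penalty, which discourages atoms of different class dictionaries from being correlated. Assuming the standard form of that term, Σ_{i≠j} ||D_i^T D_j||_F^2 (the cost-function figure is not reproduced in this copy), it can be computed as in this illustrative sketch (`D_range` uses 0-based Python indexing):

```python
import numpy as np

def incoherence_penalty(D, D_range):
    # Sum over pairs i != j of ||D_i^T D_j||_F^2: the eta-weighted term
    # in DLSI that promotes incoherence between class-specific dictionaries.
    C = len(D_range) - 1
    blocks = [D[:, D_range[i]:D_range[i + 1]] for i in range(C)]
    total = 0.0
    for i in range(C):
        for j in range(C):
            if i != j:
                total += np.linalg.norm(blocks[i].T @ blocks[j], 'fro') ** 2
    return total
```

The penalty is zero exactly when every pair of class dictionaries has mutually orthogonal atoms.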

DLSI predict new samples

  • function pred = DLSI_pred(Y, D, opts)
  • Predicts labels of new samples Y given the trained dictionary D and the parameters stored in opts.

Demo

Run DLSI_top in the MATLAB command window.

Dictionary learning for separating the particularity and the commonality (COPAR)

  • An implementation of COPAR [7].

Cost function


Training COPAR

  • function [D, X, rt] = COPAR(Y, Y_range, opts)

  • INPUT:

    • Y, Y_range: training samples and their label ranges (see Notation).
    • opts: a struct
      • opts.lambda, opts.eta: lambda and eta in the cost function
      • opts.max_iter: maximum iterations.
  • OUTPUT:

    • D: the trained dictionary,
    • X: the trained sparse coefficient,
    • rt: total running time of the training process.

COPAR predict new samples

  • function pred = COPAR_pred(Y, D, D_range_ext, opts)

  • Predicts labels of the input Y.

  • INPUT:

    • Y: test samples
    • D: the trained dictionary
    • D_range_ext: range of class-specific and shared dictionaries in D. The shared dictionary is located at the end of D.
    • opts: a struct of options:
      • opts.classify_mode: classification mode, either 'GC' (global coding) or 'LC' (local coding).
      • opts.lambda, opts.eta, opts.max_iter: as in COPAR.m.
  • OUTPUT:

    • pred: predicted labels of Y.

Demo

Run COPAR_top in the MATLAB command window.

LRSDL

  • An implementation of LRSDL [8].

Motivation

Cost function

Note that unlike COPAR, in LRSDL, we separate the class-specific dictionaries (D) and the shared dictionary (D_0). The sparse coefficients (X, X^0) are also separated.

Training LRSDL

  • function [D, D0, X, X0, CoefM, coefM0, opts, rt] = LRSDL(Y, train_label, opts)

  • INPUT:

    • Y, train_label: training samples and their labels
    • opts: a struct
      • opts.lambda1, opts.lambda2: lambda1 and lambda2 in the cost function,
      • opts.lambda3: eta in the cost function,
      • opts.max_iter: maximum iterations,
      • opts.D_range: range of the trained dictionary,
      • opts.k0: size of the shared dictionary
  • OUTPUT:

    • D, D0, X, X0: trained matrices as in the cost function,
    • CoefM: the mean matrix. CoefM(:, c) is the mean vector of X_c (mean of columns).
    • CoefM0: the mean vector of X0,
    • rt: total running time (in seconds).
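
The CoefM output described above is just the matrix of per-class mean coefficient vectors. A minimal illustrative sketch (not the toolbox code; `coef_means` is a hypothetical name, and Y_range here uses 0-based Python indexing):

```python
import numpy as np

def coef_means(X, Y_range):
    # Column c is the mean of the coefficient vectors of class c,
    # mirroring the CoefM output described above.
    C = len(Y_range) - 1
    return np.column_stack([X[:, Y_range[c]:Y_range[c + 1]].mean(axis=1)
                            for c in range(C)])
```

These class means are what the Fisher-style discrimination terms in LRSDL/FDDL pull the coefficients toward.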

LRSDL predict new samples

See the LRSDL_pred_GC.m function.

Demo

Run LRSDL_top in the MATLAB command window.

Fisher discrimination dictionary learning (FDDL)

  • An implementation of FDDL [4].

Cost function

Similar to the LRSDL cost function, but without the terms shown in red.

Training FDDL

Set opts.k0 = 0 and use the LRSDL.m function.

FDDL predict new samples

  • function pred = FDDL_pred(Y, D, CoefM, opts)

Discriminative Feature-Oriented dictionary learning (DFDL)

D2L2R2

  • Update later

Fast iterative shrinkage-thresholding algorithm (FISTA)
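
FISTA [10] is the workhorse sparse-coding solver used throughout the toolbox. For the generic lasso subproblem 0.5*||y - Dx||_2^2 + lambda*||x||_1, a minimal NumPy sketch (illustrative; the toolbox's own fista functions are MATLAB):

```python
import numpy as np

def fista_lasso(D, y, lam, n_iter=100):
    # FISTA (Beck & Teboulle [10]) for 0.5*||y - D x||_2^2 + lam*||x||_1.
    L = np.linalg.norm(D, 2) ** 2       # Lipschitz constant of the smooth part
    x = z = np.zeros(D.shape[1])
    t = 1.0
    for _ in range(n_iter):
        grad = D.T @ (D @ z - y)
        v = z - grad / L                # gradient step at the extrapolated point
        x_new = np.sign(v) * np.maximum(np.abs(v) - lam / L, 0.0)  # prox of lam*||.||_1
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        z = x_new + ((t - 1) / t_new) * (x_new - x)                # momentum step
        x, t = x_new, t_new
    return x
```

The momentum (extrapolation) step is what improves the O(1/k) rate of plain ISTA to O(1/k^2).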

References

[1]. (SRC) Wright, John, et al. "Robust face recognition via sparse representation." IEEE Transactions on Pattern Analysis and Machine Intelligence 31.2 (2009): 210-227. [paper]

[2]. (ODL) Mairal, Julien, et al. "Online learning for matrix factorization and sparse coding." The Journal of Machine Learning Research 11 (2010): 19-60. [paper]

[3]. (LC-KSVD) Jiang, Zhuolin, Zhe Lin, and Larry S. Davis. "Label consistent K-SVD: Learning a discriminative dictionary for recognition." IEEE Transactions on Pattern Analysis and Machine Intelligence 35.11 (2013): 2651-2664. [Project page]

[4]. (FDDL) Yang, Meng, et al. "Fisher discrimination dictionary learning for sparse representation." IEEE International Conference on Computer Vision (ICCV), 2011. [paper], [code]

[5]. (DLSI) Ramirez, Ignacio, Pablo Sprechmann, and Guillermo Sapiro. "Classification and clustering via dictionary learning with structured incoherence and shared features." IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2010. [paper]

[6]. (DFDL) Tiep H. Vu, H. S. Mousavi, V. Monga, A. U. Rao, and G. Rao. "Histopathological Image Classification using Discriminative Feature-Oriented Dictionary Learning." IEEE Transactions on Medical Imaging 35.3 (2016): 738-751. [paper], [Project page]

[7]. (COPAR) Kong, Shu, and Donghui Wang. "A dictionary learning approach for classification: separating the particularity and the commonality." European Conference on Computer Vision (ECCV), 2012: 186-199. [paper]

[8]. (LRSDL) Tiep H. Vu, Vishal Monga. "Learning a low-rank shared dictionary for object classification." IEEE International Conference on Image Processing (ICIP), 2016. [paper]

[9]. (SVT) Cai, Jian-Feng, Emmanuel J. Candès, and Zuowei Shen. "A singular value thresholding algorithm for matrix completion." SIAM Journal on Optimization 20.4 (2010): 1956-1982. [paper]

[10]. (FISTA) Beck, Amir, and Marc Teboulle. "A fast iterative shrinkage-thresholding algorithm for linear inverse problems." SIAM Journal on Imaging Sciences 2.1 (2009): 183-202. [paper]

[11]. (SPAMS) The Sparse Modeling Software
