Commit

Fix typo in posterior function name. Add example in README for LDA solver interface
trthatcher committed Jan 6, 2021
1 parent 7d220c9 commit cb67b70
Showing 2 changed files with 33 additions and 12 deletions.
43 changes: 32 additions & 11 deletions README.md
````diff
@@ -3,8 +3,6 @@
 [![Build Status](https://travis-ci.org/trthatcher/DiscriminantAnalysis.jl.svg?branch=master)](https://travis-ci.org/trthatcher/DiscriminantAnalysis.jl)
 [![Coverage Status](https://coveralls.io/repos/trthatcher/DiscriminantAnalysis.jl/badge.svg?branch=master&service=github)](https://coveralls.io/github/trthatcher/DiscriminantAnalysis.jl?branch=master)
 
-#### Summary
-
 **DiscriminantAnalysis.jl** is a Julia package for multiple linear and quadratic
 regularized discriminant analysis (LDA & QDA respectively). LDA and QDA are
 distribution-based classifiers with the underlying assumption that data follows
@@ -14,18 +12,41 @@ covariance matrix whereas QDA relaxes that constraint and allows for distinct
 within-class covariance matrices. This results in LDA being a linear classifier
 and QDA being a quadratic classifier.
 
-#### Documentation
+The package is currently a work in progress - see [issue #12](https://github.com/trthatcher/DiscriminantAnalysis.jl/issues/12) for the package status.
 
+## Getting Started
+
+A bare-bones implementation of LDA is currently available. The script below demonstrates how to fit an LDA model to some synthetic data using the low-level interface:
+
+```julia
+using DiscriminantAnalysis
+using Random
+
-Documentation is a work in progress.
+const DA = DiscriminantAnalysis
 
-#### Visualization
+# Generate two sets of 100 samples of a 5-dimensional random normal
+# variable offset by +1/-1
+X = [randn(100, 5) .- 1;
+     randn(100, 5) .+ 1]
 
-When the data is modelled via linear discriminant analysis, the resulting
-classification boundaries are hyperplanes (lines in two dimensions):
+# Generate class labels for the two samples
+# NOTE: classes must be indexed by integers from 1 to the number of
+#       classes (2 in this case)
+y = repeat(1:2, inner=100)
 
-<p align="center"><img alt="Linear Discriminant Analysis" src="docs/src/assets/lda.png" /></p>
+# Set the solver options
+dims = 1                    # use 1 for row-per-observation; 2 for columns
+canonical = true            # use true to compute the canonical coords
+compute_covariance = false  # use true to compute & store covariance
+centroids = nothing         # supply a precomputed set of class centroids
+priors = Float64[1/2; 1/2]  # prior class weights
+gamma = nothing             # gamma regularization parameter
 
-Using quadratic discriminant analysis, the resulting classification boundaries
-are quadratics:
+# Fit a model
+model = DA.LinearDiscriminantModel{Float64}()
+DA._fit!(model, y, X, dims, canonical, compute_covariance, centroids, priors, gamma)
 
-<p align="center"><img alt="Quadratic Discriminant Analysis" src="docs/src/assets/qda.png" /></p>
+# Get the posterior probabilities for new data
+Z = rand(10, 5) .- 0.5
+Z_prob = DA.posteriors(model, Z)
+```
````
2 changes: 1 addition & 1 deletion src/discriminants.jl
```diff
@@ -22,7 +22,7 @@ function posteriors!(Π::Matrix{T}, LDA::LinearDiscriminantModel{T}, X::Matrix{T
 end
 
 
-function posteriors!(LDA::LinearDiscriminantModel{T}, X::Matrix{T}) where T
+function posteriors(LDA::LinearDiscriminantModel{T}, X::Matrix{T}) where T
     Δ = discriminants(LDA, X)
     return _posteriors!(Δ, LDA)
 end
```
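
The rename drops the `!` because this method does not mutate its argument: it allocates a fresh discriminant matrix via `discriminants`, and only the internal `_posteriors!` writes into that new buffer, so by Julia naming convention the public entry point should be `posteriors`. A minimal sketch of calling the renamed function, reusing the README example above (the `_fit!` argument order and the zero-argument `LinearDiscriminantModel` constructor are taken from this commit's diff, not from documented API):

```julia
using DiscriminantAnalysis

const DA = DiscriminantAnalysis

# Synthetic training data: two classes of 100 observations in 5 dimensions
X = [randn(100, 5) .- 1;
     randn(100, 5) .+ 1]
y = repeat(1:2, inner=100)

# Fit with the low-level solver interface shown in the README diff
model = DA.LinearDiscriminantModel{Float64}()
DA._fit!(model, y, X, 1, true, false, nothing, Float64[1/2; 1/2], nothing)

# Non-mutating call: allocates and returns a 10×2 matrix of class
# posterior probabilities, leaving `model` untouched
Z_prob = DA.posteriors(model, rand(10, 5) .- 0.5)
```

Each row of `Z_prob` should sum to one across the two classes, assuming the model normalizes the discriminant values as the `_posteriors!` name suggests.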
