Principal Component Analysis

A common technique for image compression is principal component analysis (PCA). Image quality and the compression ratio both depend on the number of principal components retained.

PCA is performed on the "emotion_detection" dataset, which contains 213 images with 6 labels: Happy, Fear, Angry, Disgust, Surprise, and Sad.

How does PCA work?

First, the original data is normalized to have a mean of 0 and a variance of 1.
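This standardization step can be sketched in NumPy as follows; the function name and the random example data are illustrative, not from the repository.

```python
import numpy as np

def standardize(X):
    """Center each feature to mean 0 and scale it to variance 1.

    X is an (n_samples, n_features) matrix; features with zero
    variance are left unscaled to avoid division by zero.
    """
    mean = X.mean(axis=0)
    std = X.std(axis=0)
    std[std == 0] = 1.0
    return (X - mean) / std

# Example: random data standardized column-wise.
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, scale=3.0, size=(213, 64))
Z = standardize(X)
print(np.allclose(Z.mean(axis=0), 0.0))  # True
print(np.allclose(Z.std(axis=0), 1.0))   # True
```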

Next, the normalized data is projected onto the eigenvectors (the columns of $U$), transforming it into a new feature space: $Z = U^T X$.

The procedure for finding U is illustrated below.
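As a minimal sketch of that procedure: compute the covariance matrix of the standardized data, eigendecompose it, and sort the eigenvectors by decreasing eigenvalue. The function name is illustrative; the repository's actual implementation may differ.

```python
import numpy as np

def pca_basis(X):
    """Return eigenvalues (descending) and the eigenvector matrix U
    of the covariance of X, where X is (n_samples, n_features) and
    assumed already standardized."""
    cov = np.cov(X, rowvar=False)           # (n_features, n_features)
    eigvals, eigvecs = np.linalg.eigh(cov)  # ascending for symmetric matrices
    order = np.argsort(eigvals)[::-1]       # reorder to descending
    return eigvals[order], eigvecs[:, order]

rng = np.random.default_rng(0)
X = rng.normal(size=(213, 16))
X = (X - X.mean(axis=0)) / X.std(axis=0)

eigvals, U = pca_basis(X)
Z = X @ U        # row-wise form of Z = U^T X (samples stored as rows)
print(Z.shape)   # (213, 16)
```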

Best n_components

The eigenvalues are presented below in descending order.

The elbow method is employed to select the optimal n_components, which involves retaining data projections only onto principal components with significant eigenvalues.
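A common way to make this concrete is to keep the smallest number of components whose cumulative explained-variance ratio passes a chosen cutoff; the toy eigenvalue spectrum and the 80% threshold below are illustrative, not values from this project.

```python
import numpy as np

def n_components_for_variance(eigvals, threshold):
    """Smallest number of components whose cumulative explained-variance
    ratio reaches `threshold`. Eigenvalues must be sorted descending."""
    ratios = eigvals / eigvals.sum()
    cumulative = np.cumsum(ratios)
    return int(np.searchsorted(cumulative, threshold) + 1)

# Toy spectrum with an obvious elbow after the first three components.
eigvals = np.array([40.0, 25.0, 20.0, 5.0, 4.0, 3.0, 2.0, 1.0])
print(n_components_for_variance(eigvals, 0.80))  # -> 3
```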

Eigenfaces

Eigenfaces are the eigenvectors of the covariance matrix of a set of face images; they are used in facial recognition and image compression.

Below are the first four eigenfaces with the largest eigenvalues.

Below are the last four eigenfaces with the smallest eigenvalues.
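The eigenface pipeline can be sketched as follows: each eigenvector is reshaped back into an image, and a face is compressed by keeping only its leading projection coefficients. The 32×32 image size, the random stand-in data, and the choice of k = 50 are assumptions for illustration only.

```python
import numpy as np

h = w = 32
rng = np.random.default_rng(0)
faces = rng.normal(size=(213, h * w))   # flattened face images (stand-in data)
faces -= faces.mean(axis=0)             # center the data

cov = np.cov(faces, rowvar=False)
eigvals, U = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
U = U[:, order]                         # columns sorted by decreasing eigenvalue

first_four = U[:, :4].T.reshape(4, h, w)   # eigenfaces with largest eigenvalues
last_four = U[:, -4:].T.reshape(4, h, w)   # eigenfaces with smallest eigenvalues

# Compress one face to k coefficients, then reconstruct it.
k = 50
face = faces[0]
coeffs = U[:, :k].T @ face              # Z = U^T x
reconstruction = U[:, :k] @ coeffs
print(first_four.shape, reconstruction.shape)  # (4, 32, 32) (1024,)
```

Since the reconstruction is an orthogonal projection onto the top-k subspace, its residual is never larger than the original vector.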

About

Machine Learning Course [ECE 501] - Spring 2023 - University of Tehran - Dr. A. Dehaqani, Dr. Tavassolipour
