Multi-view dimensionality reduction

An sklearn-compatible Python package for multi-view dimensionality reduction, including multi-view canonical correlation analysis (MCCA) and AJIVE.

Installation

git clone https://github.com/idc9/mvdr.git
cd mvdr
python setup.py install

Note that mvdr.ajive assumes you have installed ya_pca, which can be found at https://github.com/idc9/ya_pca.git.
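
For example, ya_pca can be installed the same way as mvdr itself; the commands below assume it also provides a setup.py-based install.

git clone https://github.com/idc9/ya_pca.git
cd ya_pca
python setup.py install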

Example

from mvdr.mcca.mcca import MCCA
from mvdr.mcca.k_mcca import KMCCA
from mvdr.toy_data.joint_fact_model import sample_joint_factor_model

# sample data from a joint factor model with 3 components
# each data block is X_b = U diag(svals) W_b^T + E_b, where the joint scores matrix U
# and each block loadings matrix W_b are orthonormal, and E_b is a random noise matrix
Xs, U_true, Ws_true = sample_joint_factor_model()

# fit MCCA (this is the SUMCORR-AVGVAR flavor of multi-CCA)
mcca = MCCA(n_components=3).fit(Xs)

# MCCA with regularization
mcca = MCCA(n_components=3, regs=0.1).fit(Xs)

# informative MCCA where we first apply PCA to each data matrix
mcca = MCCA(n_components=3, signal_ranks=[5, 5, 5]).fit(Xs)

# kernel-MCCA
kmcca = KMCCA(n_components=3, regs=.1, kernel='linear')
kmcca.fit(Xs)
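
For reference, the sketch below reproduces the toy generative model described in the comment above using plain numpy. The dimensions, singular values, and noise scale are illustrative assumptions, not the exact defaults of sample_joint_factor_model.

import numpy as np

# illustrative sizes (assumed, not the sampler's defaults)
n_samples, n_features, n_components, n_blocks = 200, 20, 3, 3
rng = np.random.default_rng(0)

# orthonormal joint scores U, shared across all blocks
U, _ = np.linalg.qr(rng.normal(size=(n_samples, n_components)))
svals = np.array([10.0, 8.0, 6.0])  # joint singular values (assumed)

Xs_toy = []
for _ in range(n_blocks):
    # orthonormal block loadings W_b and additive noise E_b
    W_b, _ = np.linalg.qr(rng.normal(size=(n_features, n_components)))
    E_b = rng.normal(scale=0.5, size=(n_samples, n_features))
    Xs_toy.append(U @ np.diag(svals) @ W_b.T + E_b)  # X_b = U diag(svals) W_b^T + E_b

Fitting MCCA(n_components=3) on Xs_toy should recover this joint structure (up to sign and rotation), mirroring how the package's own toy sampler is used above.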

Help and support

Additional documentation, examples, and code revisions are coming soon. For questions, issues, or feature requests, please reach out to Iain at idc9@cornell.edu.

Contributing

We welcome contributions that make this a stronger package: data examples, bug fixes, spelling corrections, new features, etc.

Citation

DOI
