comparison-algorithms

Neuronal Manifold Learning Algorithms

Welcome to the comparison-algorithms repository, where we explore and compare various embedding methods, in particular neuronal manifold learning techniques. Our novel algorithm, BunDLe-Net, is one of the focal points of this research. For the original implementation of BunDLe-Net, please visit its GitHub repository.

About the Project

This repository is dedicated to evaluating and comparing different embedding methods, as documented in the BunDLe-Net article available at https://www.biorxiv.org/content/10.1101/2023.08.08.551978v2.

Repository Structure

The repository is structured so that evaluations of embeddings for specific neuronal datasets and algorithms can be reproduced easily, without rerunning redundant code.

  • The core functions for BunDLe-Net can be found in the functions/ directory.
  • Embeddings are generated by running the numbered Python scripts (1_PCA.py, 2_autoencoder.py, and so on); the resulting embeddings are saved in the data/generated/saved_Y/ directory.
  • Evaluations of these saved embeddings are performed by the scripts in the evaluation_scripts/ directory, such as microvariable_evaluation.py, behaviour_decoding.py, and dynamics_predictability.py.
  • To run all evaluations for every worm and algorithm, you can use the run_evaluations.sh bash script.
  • Finally, all plots of the evaluation metrics are produced by plotting.py (see the workflow sketch below).
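
A typical end-to-end run might look like the sketch below. This is only an illustration: it assumes the scripts are executed from the repository root and take no command-line arguments, which should be verified against the scripts themselves.

    # Sketch of the full pipeline (assumed invocations; check each script before running)
    python 1_PCA.py            # generate PCA embeddings -> data/generated/saved_Y/
    python 2_autoencoder.py    # generate autoencoder embeddings
    bash run_evaluations.sh    # evaluate all saved embeddings for every worm and algorithm
    python plotting.py         # plot the evaluation metrics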

In summary, if you wish to recompute an embedding using a specific method for a given dataset, run the corresponding script labeled i_<algorithm>.py, where 'i' ranges from 1 to 7. If you intend to re-evaluate a specific algorithm's embedding for a particular dataset, refer to the commands in run_evaluations.sh (a hypothetical example is sketched below).
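
For a targeted re-evaluation, the invocation would resemble the hypothetical loop below. The algorithm names, worm indices, and argument conventions shown here are assumptions made for illustration; the authoritative invocation is the one inside run_evaluations.sh.

    # Hypothetical re-evaluation loop in the spirit of run_evaluations.sh; the real
    # script defines the actual algorithm names, worm indices, and argument order.
    for algorithm in PCA autoencoder BunDLe-Net; do
        for worm in 0 1 2 3 4; do
            python evaluation_scripts/behaviour_decoding.py "$algorithm" "$worm"
        done
    done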
