ShanechiLab/CrossModalDistillation

Folders and files

NameName
Last commit message
Last commit date

Latest commit

 

History

4 Commits
 
 
 
 

Repository files navigation

Cross-modal knowledge distillation

Cross-modal knowledge distillation from multi-session spike models to local field potential (LFP) models, yielding LFP models with enhanced representational power. Code will be released soon!

Please check out our manuscript for full results and details.
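Since the code has not yet been released, here is a minimal illustrative sketch of the general idea of representational knowledge distillation (a teacher model's representations serve as targets for a student model). All names, shapes, and the choice of a mean-squared-error objective are assumptions for illustration, not taken from the released code or the manuscript.

```python
# Hypothetical sketch: a "teacher" spike model provides target
# representations that a "student" LFP model is trained to match.
# The MSE objective and all names here are illustrative assumptions.

def distillation_loss(teacher_repr, student_repr):
    """Mean squared error between teacher and student representation vectors."""
    assert len(teacher_repr) == len(student_repr)
    return sum((t - s) ** 2 for t, s in zip(teacher_repr, student_repr)) / len(teacher_repr)

# Toy example: minimizing this loss nudges the student's embedding
# of an LFP trial toward the teacher's embedding of the spike trial.
teacher_embedding = [0.2, -1.0, 0.5]   # from the (frozen) spike model
student_embedding = [0.0, -0.8, 0.9]   # from the LFP model being trained
loss = distillation_loss(teacher_embedding, student_embedding)
```

In practice such a loss would be combined with the student's own task loss and computed over batches of trials; see the manuscript for the actual objective.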

Publication:

Erturk, E., Hashemi, S., and Shanechi, M. M. Cross-Modal Representational Knowledge Distillation for Enhanced Spike-informed LFP Modeling. In Advances in Neural Information Processing Systems, 2025.

License:

Copyright (c) 2025 University of Southern California
See full notice in LICENSE.md
Eray Erturk, Saba Hashemi, and Maryam M. Shanechi
Shanechi Lab, University of Southern California
