
Engram Memory Network: Brain-Inspired Prototype Explanations

Official implementation of "Engram Memory Network: Brain-Inspired Prototype Explanations" (IJCNN 2026, part of WCCI 2026).

Requirements

pip install -r requirements.txt

Training

Training proceeds in two stages: first train the backbone (encoder + decoder, without the engram memory module), then train the EMN using the saved backbone weights.

Supported datasets: CIFAR10, CIFAR100, CIFAR100SUPER, MNIST, HAM, CUB. Other datasets can be added by extending dataloader.py (see the sketch after the training commands below).

# Train backbone
python run_backbone_train.py --opt deit_small_bb_fr --dataset_name CIFAR10 --number 1

# Train EMN
python run_train.py \
  --dataset_name CIFAR10 \
  --backbone_name deit_small_bb_fr \
  --backbone_path backbone_weights/deit_small_bb_fr/CIFAR10_1 \
  --marker CIFAR10_1
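
To add a new dataset, extend dataloader.py. The sketch below shows one plausible shape for that extension; the get_dataset function, its signature, the resize size, and the MYDATA folder layout are hypothetical, so adapt them to the actual dispatch in dataloader.py.

# Hypothetical sketch of extending dataloader.py with a new dataset.
# get_dataset, its signature, and the MYDATA branch are illustrative only.
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

def get_dataset(dataset_name, train=True):
    tf = transforms.Compose([transforms.Resize((224, 224)),
                             transforms.ToTensor()])
    if dataset_name == 'CIFAR10':
        return datasets.CIFAR10('./data', train=train, download=True, transform=tf)
    if dataset_name == 'MYDATA':  # new dataset: one folder per class
        split = 'train' if train else 'val'
        return datasets.ImageFolder(f'./data/mydata/{split}', transform=tf)
    raise ValueError(f'Unknown dataset: {dataset_name}')

loader = DataLoader(get_dataset('CIFAR10'), batch_size=128, shuffle=True)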

Evaluation

python eval.py --load_model_path logs/CIFAR10_CIFAR10_1 --device_id 0

The script computes classification accuracy plus the LPIPS (AlexNet and VGG variants) and DISTS perceptual-similarity metrics, and writes the results to eval.csv.
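
Both perceptual metrics come from published packages. Below is a minimal sketch of how they are typically invoked; the lpips and DISTS_pytorch packages, the input ranges, and the toy tensors are assumptions for illustration, not eval.py's actual wiring.

# Minimal sketch using the public lpips and DISTS_pytorch packages;
# eval.py's actual wiring may differ.
import torch
import lpips                     # pip install lpips
from DISTS_pytorch import DISTS  # pip install dists-pytorch

img0 = torch.rand(1, 3, 32, 32)  # e.g. a decoded prototype, values in [0, 1]
img1 = torch.rand(1, 3, 32, 32)  # e.g. a reference image

# LPIPS expects inputs scaled to [-1, 1]; lower means more similar.
lpips_alex = lpips.LPIPS(net='alex')
lpips_vgg = lpips.LPIPS(net='vgg')
d_alex = lpips_alex(img0 * 2 - 1, img1 * 2 - 1)
d_vgg = lpips_vgg(img0 * 2 - 1, img1 * 2 - 1)

# DISTS expects inputs in [0, 1]; lower is also better.
d_dists = DISTS()(img0, img1)

print(d_alex.item(), d_vgg.item(), d_dists.item())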

Results

Visual Similarity on CIFAR-10 (×10⁻¹, lower is better)

Model         LPIPS Top-1  LPIPS Top-3  DISTS Top-1  DISTS Top-3
ProtoVAE      7.50 ± 0.06  7.50 ± 0.07  6.75 ± 0.11  6.76 ± 0.11
ProtoPNet     6.95 ± 0.19  7.00 ± 0.32  4.07 ± 0.13  4.02 ± 0.12
ProtoViT      6.26 ± 0.10  6.31 ± 0.07  3.67 ± 0.03  3.69 ± 0.03
ProtoPFormer  6.23 ± 0.02  6.24 ± 0.02  3.68 ± 0.01  3.69 ± 0.02
EMN (ours)    5.38 ± 0.02  5.50 ± 0.02  3.22 ± 0.01  3.28 ± 0.01

Classification Accuracy (%)

Model         CIFAR-10  CIFAR-100  MNIST  HAM10000  CUB-200
ProtoVAE      81.59     50.37      99.05  –         –
ProtoPNet     94.30     68.63      99.06  62.92     22.43
ProtoViT      97.63     86.41      99.09  80.49     74.09
ProtoPFormer  97.85     87.83      99.41  66.95     50.51
EMN (ours)    96.39     85.28      99.19  88.51     68.74

Subclass Alignment on CIFAR-100S (%, higher is better)

Model         Top-1         Top-3
ProtoPNet     2.28 ± 0.91   1.95 ± 0.47
ProtoViT      15.93 ± 6.55  14.03 ± 7.04
ProtoPFormer  19.49 ± 1.60  19.42 ± 1.38
EMN (ours)    39.87 ± 0.72  36.07 ± 0.36

Project Structure

├── run_train.py              # EMN training entry point
├── run_backbone_train.py     # Backbone training entry point
├── train.py                  # EMN training loop
├── eval.py                   # Evaluation (accuracy, LPIPS, DISTS)
├── dataloader.py             # Dataset loading
├── requirements.txt
├── args/
│   └── get_args.py           # Argument parser
├── emn/
│   └── emn.py                # EMN model (Eq. 1-4)
├── backbone/
│   ├── utils.py              # Encoder/decoder factory
│   ├── backbone_train.py     # Backbone training loop
│   ├── backbone_model.py     # Backbone model (encoder+decoder)
│   ├── encoder.py            # DeiT-Small encoder
│   ├── decoder.py            # ViT decoder
│   └── module.py             # Transformer blocks
├── visualize/                # Explanation visualization
└── assets/                   # Figures for README
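
As a reading aid for the layout above: backbone_model.py pairs the DeiT-Small encoder with the ViT decoder, and emn/emn.py adds the engram memory on top. The following is a conceptual sketch of how such pieces could compose, with made-up names and a generic cosine-similarity readout; it is not the paper's Eq. 1-4, which live in emn/emn.py.

# Conceptual sketch of how the pieces might compose; not Eq. 1-4.
# PrototypeMemorySketch and its readout are illustrative assumptions.
import torch
import torch.nn as nn

class PrototypeMemorySketch(nn.Module):
    def __init__(self, encoder, decoder, num_prototypes, latent_dim, num_classes):
        super().__init__()
        self.encoder = encoder   # e.g. the trained DeiT-Small encoder
        self.decoder = decoder   # e.g. the trained ViT decoder
        self.memory = nn.Parameter(torch.randn(num_prototypes, latent_dim))
        self.classifier = nn.Linear(num_prototypes, num_classes)

    def forward(self, x):
        z = self.encoder(x)  # assumed shape: (N, latent_dim)
        # Similarity of each latent to each memory slot -> class logits.
        sim = torch.cosine_similarity(z.unsqueeze(1),
                                      self.memory.unsqueeze(0), dim=-1)
        return self.classifier(sim), sim

    def render_prototypes(self):
        # Decode memory slots to pixel space so prototypes can be
        # inspected as images.
        return self.decoder(self.memory)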


Citation

@inproceedings{emn2026,
  title={Engram Memory Network: Brain-Inspired Prototype Explanations},
  author={Kim, Hyunjun and Ha, Myoung Hoon},
  booktitle={International Joint Conference on Neural Networks (IJCNN)},
  year={2026}
}
