Incremental Learning

  • (arXiv 2021.12) Improving Vision Transformers for Incremental Learning, [Paper]
  • (arXiv 2022.03) Meta-attention for ViT-backed Continual Learning, [Paper], [Code]
  • (arXiv 2022.03) Towards Exemplar-Free Continual Learning in Vision Transformers: an Account of Attention, Functional and Weight Regularization, [Paper]
  • (arXiv 2022.07) Online Continual Learning with Contrastive Vision Transformer, [Paper]
  • (arXiv 2022.08) D3Former: Debiased Dual Distilled Transformer for Incremental Learning, [Paper], [Code]
  • (arXiv 2022.10) A Memory Transformer Network for Incremental Learning, [Paper]
  • (arXiv 2023.01) Combined Use of Federated Learning and Image Encryption for Privacy-Preserving Image Classification with Vision Transformer, [Paper]
  • (arXiv 2023.03) Learning to Grow Artificial Hippocampi in Vision Transformers for Resilient Lifelong Learning, [Paper]
  • (arXiv 2023.03) Dense Network Expansion for Class Incremental Learning, [Paper]
  • (arXiv 2023.03) Semantic-visual Guided Transformer for Few-shot Class-incremental Learning, [Paper]
  • (arXiv 2023.04) Continual Detection Transformer for Incremental Object Detection, [Paper]
  • (arXiv 2023.04) Preserving Locality in Vision Transformers for Class Incremental Learning, [Paper]
  • (arXiv 2023.05) BiRT: Bio-inspired Replay in Vision Transformers for Continual Learning, [Paper], [Code]
  • (arXiv 2023.06) TADIL: Task-Agnostic Domain-Incremental Learning through Task-ID Inference using Transformer Nearest-Centroid Embeddings, [Paper]
  • (arXiv 2023.08) On the Effectiveness of LayerNorm Tuning for Continual Learning in Vision Transformers, [Paper], [Code]
  • (arXiv 2023.08) Exemplar-Free Continual Transformer with Convolutions, [Paper], [Project]
  • (arXiv 2023.08) Introducing Language Guidance in Prompt-based Continual Learning, [Paper]
  • (arXiv 2023.11) CMFDFormer: Transformer-based Copy-Move Forgery Detection with Continual Learning, [Paper]
  • (arXiv 2023.12) Fine-Grained Knowledge Selection and Restoration for Non-Exemplar Class Incremental Learning, [Paper], [Code]
  • (arXiv 2024.01) PL-FSCIL: Harnessing the Power of Prompts for Few-Shot Class-Incremental Learning, [Paper], [Code]
  • (arXiv 2024.01) Dynamic Transformer Architecture for Continual Learning of Multimodal Tasks, [Paper]
  • (arXiv 2024.03) Semantically-Shifted Incremental Adapter-Tuning is A Continual ViTransformer, [Paper], [Code]
  • (arXiv 2024.04) Pre-trained Vision and Language Transformers Are Few-Shot Incremental Learners, [Paper], [Code]
  • (arXiv 2024.04) Calibrating Higher-Order Statistics for Few-Shot Class-Incremental Learning with Pre-trained Vision Transformers, [Paper], [Code]
  • (arXiv 2024.04) Remembering Transformer for Continual Learning, [Paper]