[ICLR 2026] Official Implementation for SplitLoRA: Balancing Stability and Plasticity in Continual Learning Through Gradient Space Splitting
Haomiao Qiu1,2, Miao Zhang1*, Ziyue Qiao2*, Weili Guan1, Min Zhang1, Liqiang Nie1
1 Harbin Institute of Technology (Shenzhen)
2 Great Bay University
* Corresponding author
- Paper: [arXiv:2505.22370](https://arxiv.org/abs/2505.22370)
- Code Repository: [GitHub](https://github.com/iLearn-Lab/ICLR26-SplitLoRA)
- [05/2025] Initial release
We present SplitLoRA, a continual learning method that combines orthogonal projection with LoRA. By splitting the gradient space, it mitigates interference between new and old tasks and improves the balance between plasticity and stability. This repository provides the official implementation together with training and evaluation scripts.
```bash
git clone https://github.com/iLearn-Lab/ICLR26-SplitLoRA.git
cd ICLR26-SplitLoRA
python -m venv .venv
source .venv/bin/activate   # Linux / Mac
# .venv\Scripts\activate    # Windows
pip install -r requirements.txt
```

Download the datasets from:
- CIFAR-100: https://www.cs.toronto.edu/~kriz/cifar.html
- ImageNet-R: https://github.com/hendrycks/imagenet-r
- DomainNet: https://ai.bu.edu/M3SDA/
Directory structure for the three datasets:

```
DATA_ROOT
|- train
|  |- class_folder_1
|  |  |- image_file_1
|  |  |- image_file_2
|  |- class_folder_2
|     |- image_file_2
|     |- image_file_3
|- val
   |- class_folder_1
   |  |- image_file_5
   |  |- image_file_6
   |- class_folder_2
      |- image_file_7
      |- image_file_8
```
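Before training, it can help to confirm that the rearranged data actually matches the layout above. The following is a small illustrative helper (not part of the repository) that walks `DATA_ROOT`, checks that both splits exist with non-empty class folders, and verifies that `train` and `val` expose the same class set:

```python
from pathlib import Path

def check_layout(data_root):
    """Verify the DATA_ROOT/{train,val}/<class>/<image> layout and
    return per-class image counts. Illustrative helper, not part of
    the SplitLoRA repository."""
    root = Path(data_root)
    summary = {}
    for split in ("train", "val"):
        split_dir = root / split
        if not split_dir.is_dir():
            raise FileNotFoundError(f"missing split directory: {split_dir}")
        classes = sorted(d for d in split_dir.iterdir() if d.is_dir())
        if not classes:
            raise FileNotFoundError(f"no class folders under {split_dir}")
        # count regular files (images) inside each class folder
        summary[split] = {c.name: sum(1 for f in c.iterdir() if f.is_file())
                          for c in classes}
    # train and val should contain the same set of classes
    if set(summary["train"]) != set(summary["val"]):
        raise ValueError("train/val class folders do not match")
    return summary
```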
We provide `split_[dataset].py` scripts in the `tools` folder to rearrange the directory structure. Set `root_dir` in each script to the path of the uncompressed dataset.
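As a rough illustration of what such a rearrangement does, the sketch below copies a flat `<class>/<image>` layout into the `train`/`val` structure shown above. The `val_ratio` split here is hypothetical: the actual `split_[dataset].py` scripts follow each dataset's official train/val lists instead.

```python
import random
import shutil
from pathlib import Path

def split_into_train_val(root_dir, out_dir, val_ratio=0.1, seed=0):
    """Copy a flat <class>/<image> layout into out_dir/{train,val}/<class>/.
    Hypothetical sketch: the repository's split_[dataset].py scripts use the
    datasets' official splits rather than a random ratio."""
    rng = random.Random(seed)
    for class_dir in sorted(Path(root_dir).iterdir()):
        if not class_dir.is_dir():
            continue
        images = sorted(f for f in class_dir.iterdir() if f.is_file())
        rng.shuffle(images)
        # reserve at least one image per class for validation
        n_val = max(1, int(len(images) * val_ratio))
        for split, files in (("val", images[:n_val]),
                             ("train", images[n_val:])):
            dest = Path(out_dir) / split / class_dir.name
            dest.mkdir(parents=True, exist_ok=True)
            for f in files:
                shutil.copy2(f, dest / f.name)
```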
To reproduce the results on all three datasets, run:

```bash
python reproduce.py
```
```bibtex
@article{qiu2025splitlora,
  title={SplitLoRA: Balancing Stability and Plasticity in Continual Learning Through Gradient Space Splitting},
  author={Qiu, Haomiao and Zhang, Miao and Qiao, Ziyue and Guan, Weili and Zhang, Min and Nie, Liqiang},
  journal={arXiv preprint arXiv:2505.22370},
  year={2025}
}
```
- Thanks to our supervisor and collaborators for valuable support.
- The code is built upon https://github.com/zugexiaodui/VPTinNSforCL. We sincerely thank the authors for open-sourcing their code.
This project is released under the Apache License 2.0.