K-Merge: Online Continual Merging of Adapters for On-device Large Language Models

Donald Shenaj (1,2,3), Ondrej Bohdal (1), Taha Ceritli (1), Mete Ozay (1), Pietro Zanuttigh (3), Umberto Michieli (1)

1 Samsung R&D Institute UK · 2 University of Pisa · 3 University of Padova

ACL 2026 (main)

Links: website · arXiv · BibTeX

[Paper teaser figure]

Code coming soon!

Abstract

On-device deployment of Large Language Models (LLMs) frequently leverages Low-Rank Adapters (LoRAs) to support diverse downstream tasks under tight resource constraints. To address the limited storage capacity of mobile devices, recent works have explored model merging techniques to fuse multiple LoRAs into a single one. In practice, however, LoRAs are often delivered incrementally, as users request support for new tasks (e.g., novel problem types or languages). This scenario introduces a new challenge: on-device online continual merging, where the objective is to incorporate new LoRAs while preserving the performance on previously supported tasks. In this paper, we propose a data-free and computationally efficient strategy for selecting and merging LoRAs when a new one becomes available, assuming the device can store only a limited number of adapters. Extensive experiments across real-world tasks demonstrate the superiority of our approach compared to alternative strategies while adhering to the storage budget and compute limitations of on-device settings.
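Since the code is not yet released, the budgeted online-merging setting described in the abstract can be mocked up as a purely illustrative sketch. This is not the authors' method: the selection rule (cosine similarity between flattened adapter weights) and the merge operator (elementwise averaging) are placeholder choices, and the adapter representation (a dict of weight matrices) is an assumption.

```python
import numpy as np

def flatten(adapter):
    # Concatenate all LoRA weight matrices into one vector for comparison.
    return np.concatenate([w.ravel() for w in adapter.values()])

def cosine(a, b):
    # Cosine similarity between two flattened adapters.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def merge(a, b):
    # Placeholder merge: elementwise average of matching weight matrices.
    return {k: (a[k] + b[k]) / 2.0 for k in a}

def continual_merge(pool, new_adapter, k):
    """Data-free online step: keep at most k adapters on device.
    If the pool is full, merge the incoming adapter into its nearest
    neighbour (by cosine similarity); otherwise just store it."""
    if len(pool) < k:
        return pool + [new_adapter]
    v = flatten(new_adapter)
    sims = [cosine(v, flatten(p)) for p in pool]
    i = int(np.argmax(sims))
    pool = list(pool)
    pool[i] = merge(pool[i], new_adapter)
    return pool
```

Each incoming adapter triggers at most one pairwise merge, so the step stays cheap and requires no training data, matching the compute and storage constraints the abstract describes.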

Citation

@inproceedings{shenaj2026k,
  title={K-Merge: Online Continual Merging of Adapters for On-device Large Language Models},
  author={Shenaj, Donald and Bohdal, Ondrej and Ceritli, Taha and Ozay, Mete and Zanuttigh, Pietro and Michieli, Umberto},
  booktitle={Proceedings of the 64th Annual Meeting of the Association for Computational Linguistics (ACL)},
  year={2026}
}

About

Official repository of "K-Merge: Online Continual Merging of Adapters for On-device Large Language Models" by D. Shenaj, O. Bohdal, T. Ceritli, M. Ozay, P. Zanuttigh, and U. Michieli, ACL 2026 (main).
