Computational limits of Low-Rank Adaptation (LoRA) for transformer-based models

We study the computational limits of Low-Rank Adaptation (LoRA) updates for fine-tuning transformer-based models through the lens of fine-grained complexity theory.
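For context, a minimal sketch of the standard LoRA parameterization the abstract refers to (illustrative only; dimensions and initialization here are assumptions, not the paper's setup). LoRA keeps a pretrained weight matrix frozen and learns a low-rank additive update:

```python
import numpy as np

# Illustrative sketch of a LoRA update (not the paper's construction):
# instead of fine-tuning a full weight W (d x k), LoRA learns two low-rank
# factors B (d x r) and A (r x k) with r << min(d, k); the adapted weight
# is W + B @ A.

d, k, r = 8, 8, 2  # hypothetical dimensions; r is the LoRA rank
rng = np.random.default_rng(0)

W = rng.standard_normal((d, k))  # frozen pretrained weight
B = np.zeros((d, r))             # zero-initialized, so the update starts as a no-op
A = rng.standard_normal((r, k))

W_adapted = W + B @ A            # equals W at initialization since B is zero

# Trainable parameters drop from d*k to r*(d + k).
full_params = d * k              # 64
lora_params = r * (d + k)        # 32
print(lora_params < full_params)
```

The efficiency of computing with such `W + B @ A` updates is exactly the kind of question the paper's fine-grained complexity analysis addresses.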

April 2025 · 1 min · Jerry Yao-Chieh Hu, Maojiang Su, En-Jui Kuo, Zhao Song, Han Liu