Zeroth-Order Fine-Tuning of LLMs with Transferable Static Sparsity
Zeroth-order optimization (ZO) is a memory-efficient strategy for fine-tuning Large Language Models using only forward passes. However, …
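For context, below is a minimal sketch of the standard two-point (SPSA-style) zeroth-order gradient estimate computed from forward passes only, here combined with a fixed sparse perturbation mask. The toy objective, mask construction, and hyperparameters are illustrative assumptions and do not reproduce the paper's transferable static sparsity method.

```python
# Minimal two-point zeroth-order (SPSA-style) fine-tuning sketch with a
# static sparse perturbation mask. The quadratic loss stands in for an
# LLM forward pass; all names and hyperparameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
d = 500                          # number of trainable parameters
theta = rng.normal(size=d)       # parameters being fine-tuned
target = rng.normal(size=d)

def loss(w):
    # Stand-in objective; in practice this would be the model's forward pass.
    return 0.5 * np.sum((w - target) ** 2)

# Static sparsity: perturb (and update) only a fixed subset of parameters.
sparsity = 0.9
mask = (rng.random(d) > sparsity).astype(theta.dtype)

eps, lr, steps = 1e-3, 2e-3, 2000
print(f"initial loss: {loss(theta):.2f}")
for _ in range(steps):
    z = rng.normal(size=d) * mask    # sparse random perturbation direction
    # Two forward passes give a finite-difference directional-derivative estimate.
    g_scalar = (loss(theta + eps * z) - loss(theta - eps * z)) / (2 * eps)
    theta -= lr * g_scalar * z       # ZO-SGD update along the perturbation
print(f"final loss:   {loss(theta):.2f}")
```

Because the mask is static, only the selected coordinates are ever perturbed or updated, so the residual loss on the remaining coordinates is untouched in this sketch; the point is that the gradient estimate requires no backward pass and hence no activation or optimizer-state memory beyond the two forward evaluations.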