DoRA (Weight-Decomposed LoRA)

What is DoRA (Weight-Decomposed LoRA)?

DoRA decomposes each pre-trained weight matrix into a magnitude vector and a direction matrix, applies a standard low-rank LoRA update only to the direction, and trains the magnitude directly as a small additional parameter. The adapted weight is the learned magnitude times the normalized, LoRA-updated direction. This decomposition narrows the gap to full fine-tuning and often improves accuracy and training stability over standard LoRA at a comparable parameter budget.
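The sketch below illustrates this decomposition for a single linear layer in PyTorch. It is a minimal, illustrative implementation under common assumptions (magnitude vector of size out_features, L2 norm taken per output unit's weight vector, LoRA factors initialized so the initial delta is zero); it is not the reference implementation from the DoRA paper.

import torch
import torch.nn as nn
import torch.nn.functional as F

class DoRALinear(nn.Module):
    """Sketch of a DoRA-style linear layer:
    W' = m * (W0 + scaling * B @ A) / ||W0 + scaling * B @ A||."""

    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        out_features, in_features = base.weight.shape
        # Frozen pre-trained weight W0 and its bias.
        self.weight = nn.Parameter(base.weight.detach().clone(), requires_grad=False)
        self.bias = base.bias
        # Low-rank LoRA factors that update only the direction.
        self.lora_A = nn.Parameter(torch.randn(rank, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, rank))
        self.scaling = alpha / rank
        # Trainable magnitude vector m, initialized to the norms of W0's rows.
        self.magnitude = nn.Parameter(self.weight.norm(p=2, dim=1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Directional component: frozen W0 plus the low-rank delta.
        delta = (self.lora_B @ self.lora_A) * self.scaling
        directed = self.weight + delta
        # Normalize each output unit's weight vector, then rescale by the
        # learned magnitude so magnitude and direction are trained separately.
        norm = directed.norm(p=2, dim=1, keepdim=True)
        adapted = self.magnitude.unsqueeze(1) * directed / norm
        return F.linear(x, adapted, self.bias)

Because lora_B starts at zero and the magnitude starts at the norms of W0, the adapted weight equals W0 at initialization, so training begins exactly from the pre-trained model, just as with LoRA.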

Where did the term "DoRA (Weight-Decomposed LoRA)" come from?

The term comes from the 2024 paper "DoRA: Weight-Decomposed Low-Rank Adaptation" by Liu et al. (NVIDIA), which introduced the method as an advancement in parameter-efficient fine-tuning designed to close the accuracy gap between LoRA and full fine-tuning.

How is "DoRA (Weight-Decomposed LoRA)" used today?

DoRA is gaining traction as a drop-in replacement for LoRA in parameter-efficient fine-tuning pipelines and is supported in adapter libraries such as Hugging Face PEFT. It is often chosen for tasks such as commonsense reasoning and instruction tuning, where standard LoRA might fall short of full fine-tuning; a configuration sketch follows below.
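A minimal sketch of enabling DoRA through Hugging Face PEFT, assuming a recent peft release where LoraConfig accepts use_dora; the model name, rank, and target modules are illustrative assumptions, not recommendations.

from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

# Load a base model to adapt (placeholder checkpoint).
model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

config = LoraConfig(
    r=16,                                 # rank of the directional update
    lora_alpha=32,                        # scaling applied to the low-rank delta
    target_modules=["q_proj", "v_proj"],  # which linear layers to adapt
    use_dora=True,                        # switch the adapters from LoRA to DoRA
)

peft_model = get_peft_model(model, config)
peft_model.print_trainable_parameters()

Setting use_dora=True keeps the familiar LoRA workflow; the only change is that each adapted layer also learns a magnitude vector alongside the low-rank directional update.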

Related Terms