Optimizer

The Optimizer controls how the neural network weights are updated during training. There are various options, and different LoRA guides will recommend different optimizers and associated settings. The most commonly used optimizer for SD 1.5 training is AdamW8bit (the default), which uses the least VRAM and offers sufficiently good accuracy. Alternatives include DAdaptation, which automatically adjusts the learning rate as training progresses, and Adafactor, which incorporates elements of both approaches.
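
As a concrete illustration, the sketch below shows how these optimizer choices typically map to optimizer objects in a PyTorch-based LoRA trainer. It assumes the bitsandbytes, dadaptation, and transformers packages; the parameter values shown are placeholders for illustration, not recommended settings.

```python
import torch
import bitsandbytes as bnb                       # 8-bit optimizers
from dadaptation import DAdaptAdam               # learning-rate-free Adam variant
from transformers.optimization import Adafactor  # memory-efficient adaptive optimizer

# Hypothetical stand-in for the LoRA parameters being trained.
params = [torch.nn.Parameter(torch.zeros(4, 4))]

# AdamW8bit: the common default; stores optimizer state in 8 bits to save VRAM.
optimizer = bnb.optim.AdamW8bit(params, lr=1e-4, weight_decay=0.01)

# DAdaptation: adapts the learning rate automatically, so lr is conventionally
# set to 1.0 and acts only as a multiplier.
# optimizer = DAdaptAdam(params, lr=1.0, weight_decay=0.01)

# Adafactor: uses factored second moments to reduce memory; with an explicit
# learning rate, relative_step and scale_parameter are usually disabled.
# optimizer = Adafactor(params, lr=1e-4, relative_step=False, scale_parameter=False)
```

In trainers built on kohya-ss sd-scripts, this same choice is usually exposed as the --optimizer_type option, with extra settings passed via --optimizer_args.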