Optimizer
The Optimizer controls how the neural network weights are updated during training. There are various options, and different LoRA guides will suggest different Optimizers and associated settings. The most commonly used Optimizer for SD 1.5 training is AdamW8bit (the default), which uses the least VRAM and has sufficiently good accuracy. Alternatives include DAdaptation, which automatically adjusts the learning rate as training progresses, and Adafactor, which combines the low memory use of AdamW8bit with the adaptive learning-rate behaviour of DAdaptation.
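As a rough illustration of what the Optimizer setting corresponds to under the hood, the sketch below (assuming PyTorch and the bitsandbytes package, which requires a CUDA GPU) constructs an AdamW8bit optimizer and runs a single update step. The layer, learning rate, and weight decay values are placeholders for illustration, not recommendations.

```python
import torch
import bitsandbytes as bnb

model = torch.nn.Linear(768, 768).cuda()  # stand-in for the LoRA modules being trained

# AdamW8bit keeps its optimizer state in 8-bit tensors, which is why it uses
# the least VRAM of the options discussed above.
optimizer = bnb.optim.AdamW8bit(model.parameters(), lr=1e-4, weight_decay=0.01)

# The alternatives come from other packages, e.g.:
#   dadaptation.DAdaptAdam(model.parameters(), lr=1.0)       # adjusts the learning rate automatically
#   transformers.optimization.Adafactor(model.parameters())  # memory-efficient with adaptive learning rates

# One illustrative training step:
loss = model(torch.randn(4, 768, device="cuda")).pow(2).mean()
loss.backward()
optimizer.step()
optimizer.zero_grad()
```

Swapping the optimizer only changes which class is constructed here; the rest of the training loop stays the same, which is why guides can recommend different Optimizers without changing anything else about the workflow.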