Network Alpha
Closely related to the Network Rank (Dimension): the LoRA update weights are scaled by Alpha / Rank when applied, so the smaller the Network Alpha, the larger the internally stored LoRA weights must become to compensate. It can therefore be used to dampen, or "slow down", learning. For example, an Alpha of 16 with a Network Rank (Dim) of 32 gives a scale of 16/32 = 0.5, which effectively halves the Learning Rate. If Alpha and Network Rank are set to the same value, the scale is 1 and the Learning Rate is unaffected.
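The relationship described above can be sketched as a simple scaling function (a minimal illustration assuming the common alpha/rank scaling used by LoRA implementations; the function name is illustrative, not an actual library API):

```python
def lora_scale(network_alpha: float, network_rank: int) -> float:
    """Effective multiplier applied to the LoRA delta weights.

    A scale below 1.0 dampens the update, acting like a reduced
    learning rate for the LoRA layers.
    """
    return network_alpha / network_rank

# Alpha 16 with Rank 32 halves the effective update strength:
print(lora_scale(16, 32))  # 0.5

# Alpha equal to Rank leaves the Learning Rate unaffected:
print(lora_scale(32, 32))  # 1.0
```

This is why leaving Alpha equal to Rank is a neutral default, while lowering Alpha trades faster-growing stored weights for slower effective learning.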