Network Alpha
Network Alpha is closely related to the Network Rank (Dimension): the smaller the Network Alpha value, the larger the saved LoRA neural-net weights. It can be used to dampen, or "slow down", learning, because the LoRA weights are effectively scaled by Network Alpha divided by Network Rank. For example, an Alpha of 16 with a Network Rank (Dim) of 32 gives a factor of 16/32 = 0.5, which halves the effective Learning Rate. If Alpha and Network Rank are set to the same value, the factor is 1 and there is no effect on the Learning Rate.
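The relationship can be illustrated with a minimal sketch. The function and variable names below (lora_scale, network_alpha, network_dim) are illustrative only and are not taken from any particular trainer's code:

def lora_scale(network_alpha: float, network_dim: int) -> float:
    """Scaling factor applied to the LoRA weight delta: Alpha / Rank."""
    return network_alpha / network_dim

def effective_learning_rate(base_lr: float, network_alpha: float, network_dim: int) -> float:
    """Because the update passes through the same scale factor, the practical
    effect is comparable to multiplying the learning rate by Alpha / Rank."""
    return base_lr * lora_scale(network_alpha, network_dim)

# Alpha 16 with Network Rank (Dim) 32 -> factor 0.5, i.e. the Learning Rate is halved.
print(lora_scale(16, 32))                     # 0.5
print(effective_learning_rate(1e-4, 16, 32))  # 5e-05

# Alpha equal to Network Rank -> factor 1.0, no effect on the Learning Rate.
print(lora_scale(32, 32))                     # 1.0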