optimizer_config
AdamOptimizerConfig
learning_rate
Initial learning rate to use (after the potential warmup period). Note that in some training pipelines this can
be overridden for a specific group of parameters: https://pytorch.org/docs/stable/optim.html#per-parameter-options
(e.g. see text_encoder_learning_rate and unet_learning_rate).
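The per-parameter-group override mentioned above follows PyTorch's standard optimizer API. A minimal sketch, assuming a toy model whose submodule names stand in for a text encoder and a UNet (the submodules, values, and group names here are illustrative, not taken from any specific pipeline):

```python
import torch
from torch import nn, optim

# Toy model with two submodules, standing in for a text encoder and a UNet.
model = nn.ModuleDict({
    "text_encoder": nn.Linear(8, 8),
    "unet": nn.Linear(8, 8),
})

# Per-parameter-group overrides, as described in the linked PyTorch docs:
# the base lr applies unless a group specifies its own "lr" (analogous to
# text_encoder_learning_rate / unet_learning_rate overriding learning_rate).
optimizer = optim.Adam(
    [
        {"params": model["text_encoder"].parameters(), "lr": 1e-5},
        {"params": model["unet"].parameters(), "lr": 1e-4},
    ],
    lr=1e-4,  # base learning rate (after any warmup period)
)
```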
ProdigyOptimizerConfig
learning_rate
The learning rate. For the Prodigy optimizer, the learning rate is adjusted dynamically, so a value of 1.0 is
recommended. Note that in some training pipelines this can be overridden for a specific group of parameters:
https://pytorch.org/docs/stable/optim.html#per-parameter-options (e.g. see text_encoder_learning_rate and
unet_learning_rate).
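A minimal sketch of constructing a Prodigy optimizer with the recommended learning_rate of 1.0, assuming the standalone prodigyopt package (not named in this section) provides the underlying optimizer; per-parameter-group overrides work the same way as in the Adam example above:

```python
from torch import nn
from prodigyopt import Prodigy  # assumed package providing the Prodigy optimizer

model = nn.Linear(8, 8)

# Prodigy adapts its effective step size dynamically, so learning_rate acts
# as a multiplier on the estimated step size; 1.0 is the recommended value.
optimizer = Prodigy(model.parameters(), lr=1.0)
```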