
Commit dd6de12

biswaroop1547 committed with sayakpaul and linoytsaban
[Fix] remove setting lr for T5 text encoder when using prodigy in flux dreambooth lora script (#9473)
* fix: removed setting of text encoder lr for T5 as it's not being tuned

Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
Co-authored-by: Linoy Tsaban <57615435+linoytsaban@users.noreply.github.com>
1 parent 2c6c9fc commit dd6de12

File tree

2 files changed (+0, -2 lines)


examples/dreambooth/train_dreambooth_flux.py

Lines changed: 0 additions & 1 deletion
@@ -1288,7 +1288,6 @@ def load_model_hook(models, input_dir):
         # changes the learning rate of text_encoder_parameters_one and text_encoder_parameters_two to be
         # --learning_rate
         params_to_optimize[1]["lr"] = args.learning_rate
-        params_to_optimize[2]["lr"] = args.learning_rate
 
         optimizer = optimizer_class(
             params_to_optimize,

examples/dreambooth/train_dreambooth_lora_flux.py

Lines changed: 0 additions & 1 deletion
@@ -1370,7 +1370,6 @@ def load_model_hook(models, input_dir):
         # changes the learning rate of text_encoder_parameters_one and text_encoder_parameters_two to be
         # --learning_rate
         params_to_optimize[1]["lr"] = args.learning_rate
-        params_to_optimize[2]["lr"] = args.learning_rate
 
         optimizer = optimizer_class(
             params_to_optimize,
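
Why the deleted line was wrong, as a minimal sketch (the group names and flag values below are assumptions for illustration, not the script's exact code; in the Flux DreamBooth scripts only the transformer and the CLIP text encoder are LoRA-tuned, while the T5 encoder stays frozen and so contributes no parameter group):

# Assumed layout of the optimizer parameter groups built earlier in the script.
# Prodigy is typically run with lr = 1.0 and behaves best when every group
# starts from the same learning rate, which is why the script resets the
# text-encoder group to --learning_rate before building the optimizer.
args_learning_rate = 1.0       # --learning_rate (Prodigy convention)
args_text_encoder_lr = 5e-6    # --text_encoder_lr (illustrative value)

transformer_group = {"params": ["<transformer LoRA params>"], "lr": args_learning_rate}
clip_text_encoder_group = {"params": ["<CLIP LoRA params>"], "lr": args_text_encoder_lr}

# Only two groups exist: index 0 = transformer, index 1 = CLIP text encoder.
params_to_optimize = [transformer_group, clip_text_encoder_group]

params_to_optimize[1]["lr"] = args_learning_rate  # CLIP group: valid

# The removed line targeted a third group for the T5 encoder that is never
# created, since T5 is not being tuned; indexing it would fail.
# params_to_optimize[2]["lr"] = args_learning_rate  # IndexError: list index out of range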
