
Commit 22d0489

Wrap the model when LoRA is on and only evaluation is run.

1 parent: ac095f5


paddlenlp/trainer/trainer.py (3 additions, 0 deletions)
@@ -3123,6 +3123,9 @@ def evaluation_loop(
             if self.model is self.model_wrapped and isinstance(self.model_wrapped, PipelineLayer):
                 # NOTE(gongenlei): when do_train=False, do_eval=True, we need to wrap model for pipeline
                 self.model_wrapped = fleet.distributed_model(self.model_wrapped)
+            if isinstance(self.model_wrapped, LoRAModel) and isinstance(self.model_wrapped.model, PipelineLayer):
+                # NOTE(liuting): when do_train=False, do_eval=True, lora=True, we need to wrap model for pipeline
+                self.model_wrapped = fleet.distributed_model(self.model_wrapped.model)
             model = self.model_wrapped
         else:
             model = self.model
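For context, the pre-existing check never fires with LoRA enabled: a LoRAModel wraps the PipelineLayer in its .model attribute, so isinstance(self.model_wrapped, PipelineLayer) is False and the pipeline model is never handed to fleet.distributed_model. Below is a minimal, self-contained sketch of that dispatch, not PaddleNLP code: PipelineLayer, LoRAModel, and distributed_model are hypothetical stand-ins for paddle's fleet.meta_parallel.PipelineLayer, paddlenlp.peft.LoRAModel, and fleet.distributed_model, and the real trainer also checks self.model is self.model_wrapped, which is omitted here.

class PipelineLayer:
    """Stand-in for a model partitioned for pipeline parallelism."""

class LoRAModel:
    """Stand-in for a LoRA wrapper: holds the base model in `.model`."""
    def __init__(self, model):
        self.model = model

def distributed_model(model):
    """Stand-in for fleet.distributed_model: tags the model as wrapped."""
    model.is_distributed = True
    return model

def wrap_for_eval(model_wrapped):
    # Original check: fires only when the pipeline model is NOT LoRA-wrapped,
    # because isinstance(LoRAModel(...), PipelineLayer) is False.
    if isinstance(model_wrapped, PipelineLayer):
        return distributed_model(model_wrapped)
    # Check added by this commit: with LoRA on, the PipelineLayer sits one
    # attribute deeper, so wrap the inner model instead.
    if isinstance(model_wrapped, LoRAModel) and isinstance(model_wrapped.model, PipelineLayer):
        return distributed_model(model_wrapped.model)
    return model_wrapped

base = PipelineLayer()
lora = LoRAModel(base)
wrapped = wrap_for_eval(lora)
assert wrapped is base and wrapped.is_distributed  # inner model got wrapped

Without the second check, an eval-only run (do_train=False, do_eval=True) with LoRA plus pipeline parallelism would reach the evaluation loop with an unwrapped pipeline model.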
