
Commit 1ef7503

fix (#8265)
1 parent f4a8f4c commit 1ef7503

File tree

1 file changed: 1 addition, 1 deletion

paddlenlp/transformers/llama/modeling_auto.py

Lines changed: 1 addition & 1 deletion
@@ -854,7 +854,7 @@ def get_layer_pp_info(layer_index):
         self.next_pp_stage_indexes = []
         for i in range(config.num_hidden_layers):
             pp_stage_id, input_need_reshard = get_layer_pp_info(i)
-            decoder_layers.append(LlamaDecoderLayerAuto(config, False, pp_stage_id))
+            decoder_layers.append(LlamaDecoderLayerAuto(config, i not in self.no_recompute_layers, pp_stage_id))
             if input_need_reshard:
                 self.next_pp_stage_indexes.append(i)
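
The change replaces a hard-coded False with the per-layer expression i not in self.no_recompute_layers, so every decoder layer gets the flag enabled except the layers listed in no_recompute_layers. Below is a minimal, self-contained sketch of that selection logic. It assumes the second positional argument of LlamaDecoderLayerAuto is a layerwise-recompute switch; DemoConfig and DemoDecoderLayer are hypothetical stand-ins, not the real PaddleNLP classes.

# Sketch of the per-layer recompute selection expressed by the fixed line.
# Names here are illustrative; the real code is in
# paddlenlp/transformers/llama/modeling_auto.py.

class DemoConfig:
    num_hidden_layers = 4

class DemoDecoderLayer:
    def __init__(self, config, layerwise_recompute, pp_stage_id):
        self.layerwise_recompute = layerwise_recompute
        self.pp_stage_id = pp_stage_id

no_recompute_layers = {0, 3}   # layers that should skip recompute
config = DemoConfig()

decoder_layers = []
for i in range(config.num_hidden_layers):
    pp_stage_id = 0  # placeholder; the real code calls get_layer_pp_info(i)
    # Before the fix the flag was always False, so recompute never activated.
    # After the fix it is True for every layer not listed in no_recompute_layers.
    decoder_layers.append(DemoDecoderLayer(config, i not in no_recompute_layers, pp_stage_id))

print([layer.layerwise_recompute for layer in decoder_layers])  # [False, True, True, False]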
