
Commit 7f10257

RandomGamingDev authored and sayakpaul committed
Added Code for Gradient Accumulation to work for basic_training (#8961)
added line allowing gradient accumulation to work for basic_training example
1 parent 48ad9a9 commit 7f10257

File tree

1 file changed: +1 −0 lines changed


docs/source/en/tutorials/basic_training.md

Lines changed: 1 addition & 0 deletions
```diff
@@ -340,6 +340,7 @@ Now you can wrap all these components together in a training loop with 🤗 Accelerate:
 ...         loss = F.mse_loss(noise_pred, noise)
 ...         accelerator.backward(loss)
 ...
+...         if (step + 1) % config.gradient_accumulation_steps == 0:
 ...             accelerator.clip_grad_norm_(model.parameters(), 1.0)
 ...         optimizer.step()
 ...         lr_scheduler.step()
```
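The pattern this commit adds can be illustrated standalone: gradients are accumulated across several micro-batches, and clipping plus the optimizer step happen only once every `gradient_accumulation_steps` batches. This is a minimal sketch in plain PyTorch rather than 🤗 Accelerate (the tutorial itself uses `accelerator.backward` and `accelerator.clip_grad_norm_`); the model, data, and accumulation value here are hypothetical stand-ins.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
model = torch.nn.Linear(4, 1)                      # hypothetical tiny model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
gradient_accumulation_steps = 4                    # assumed config value

# Hypothetical stand-in for a dataloader: 8 micro-batches of (inputs, targets).
batches = [(torch.randn(8, 4), torch.randn(8, 1)) for _ in range(8)]
num_updates = 0

for step, (x, y) in enumerate(batches):
    # Scale the loss so the accumulated gradient matches one large batch.
    loss = F.mse_loss(model(x), y) / gradient_accumulation_steps
    loss.backward()  # gradients accumulate in .grad across micro-batches
    # Mirror the commit's `(step + 1) % config.gradient_accumulation_steps == 0`
    # guard: clip and step only on accumulation boundaries.
    if (step + 1) % gradient_accumulation_steps == 0:
        torch.nn.utils.clip_grad_norm_(model.parameters(), 1.0)
        optimizer.step()
        optimizer.zero_grad()
        num_updates += 1

print(num_updates)  # 8 micro-batches / 4 accumulation steps = 2 updates
```

Without the guard, clipping and stepping would run on every micro-batch, defeating the accumulation; the added `if` is what makes the tutorial loop honor `config.gradient_accumulation_steps`.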
