src/lightning/fabric/CHANGELOG.md (14 additions, 2 deletions)
```diff
@@ -4,16 +4,28 @@ All notable changes to this project will be documented in this file.

 The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).

+---
+
+## [2.5.2] - 2025-3-20
+
+### Changed
+
+- Ensure correct device is used for autocast when mps is selected as Fabric accelerator ([#20876](https://github.com/Lightning-AI/pytorch-lightning/pull/20876))
+
+### Fixed
+
+- Fix: `TransformerEnginePrecision` conversion for layers with `bias=False` ([#20805](https://github.com/Lightning-AI/pytorch-lightning/pull/20805))
+

 ## [2.5.1] - 2025-03-18

 ### Changed

-- Added logging support for list of dicts without collapsing to a single key ([#19957](https://github.com/Lightning-AI/pytorch-lightning/issues/19957))
+- Added logging support for list of dicts without collapsing to a single key ([#19957](https://github.com/Lightning-AI/pytorch-lightning/pull/19957))

 ### Removed

-- Removed legacy support for `lightning run model`. Use `fabric run` instead. ([#20588](https://github.com/Lightning-AI/pytorch-lightning/pull/20588))
+- Removed legacy support for `lightning run model`. Use `fabric run` instead ([#20588](https://github.com/Lightning-AI/pytorch-lightning/pull/20588))
```
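The first fabric entry above (#20876) concerns the device handed to autocast when Fabric runs on Apple's MPS backend. A minimal sketch of the affected pattern, assuming a machine and PyTorch build where the `mps` accelerator and mixed precision are available; the model, optimizer, and batch below are placeholders, not code from the PR:

```python
import torch
from lightning.fabric import Fabric

# Select the MPS accelerator explicitly; the fix makes Fabric's autocast
# context target the "mps" device rather than falling back to another device.
fabric = Fabric(accelerator="mps", devices=1, precision="16-mixed")
fabric.launch()

model = torch.nn.Linear(32, 4)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
model, optimizer = fabric.setup(model, optimizer)

batch = fabric.to_device(torch.randn(8, 32))

with fabric.autocast():  # forward pass under mixed precision on MPS
    loss = model(batch).sum()

fabric.backward(loss)
optimizer.step()
optimizer.zero_grad()
```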
src/lightning/pytorch/CHANGELOG.md (14 additions, 27 deletions)
```diff
@@ -4,42 +4,29 @@ All notable changes to this project will be documented in this file.

 The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).

+---

-## [unreleased] - YYYY-MM-DD
-
-### Added
-
-- Add enable_autolog_hparams argument to Trainer ([#20593](https://github.com/Lightning-AI/pytorch-lightning/pull/20593))
+## [2.5.2] - 2025-06-20

+### Changed

+- Add `enable_autolog_hparams` argument to Trainer ([#20593](https://github.com/Lightning-AI/pytorch-lightning/pull/20593))
 - Add `toggled_optimizer(optimizer)` method to the LightningModule, which is a context manager version of `toggle_optimize` and `untoggle_optimizer` ([#20771](https://github.com/Lightning-AI/pytorch-lightning/pull/20771))
-
-
 - For cross-device local checkpoints, instruct users to install `fsspec>=2025.5.0` if unavailable ([#20780](https://github.com/Lightning-AI/pytorch-lightning/pull/20780))
-
-
-### Changed
-
--
-
-### Removed
-
--
-
+- Check param is of `nn.Parameter` type for pruning sanitization ([#20783](https://github.com/Lightning-AI/pytorch-lightning/pull/20783))

 ### Fixed

 - Fixed `save_hyperparameters` not working correctly with `LightningCLI` when there are parsing links applied on instantiation ([#20777](https://github.com/Lightning-AI/pytorch-lightning/pull/20777))
+- Fixed `logger_connector` has an edge case where step can be a float ([#20692](https://github.com/Lightning-AI/pytorch-lightning/pull/20692))
+- Fixed Synchronize SIGTERM Handling in DDP to Prevent Deadlocks ([#20825](https://github.com/Lightning-AI/pytorch-lightning/pull/20825))
+- Fixed case-sensitive model name ([#20661](https://github.com/Lightning-AI/pytorch-lightning/pull/20661))
+- Fix: move `check_inputs` to the target device if available during `to_torchscript` ([#20873](https://github.com/Lightning-AI/pytorch-lightning/pull/20873))
+- Fixed progress bar display to correctly handle iterable dataset and `max_steps` during training ([#20869](https://github.com/Lightning-AI/pytorch-lightning/pull/20869))
+- Fixed problem for silently supporting `jsonnet` ([#20899](https://github.com/Lightning-AI/pytorch-lightning/pull/20899))


-- Fixed logger_connector has edge case where step can be a float ([#20692](https://github.com/Lightning-AI/pytorch-lightning/pull/20692))
-
-
-- Fix: Synchronize SIGTERM Handling in DDP to Prevent Deadlocks ([#20825](https://github.com/Lightning-AI/pytorch-lightning/pull/20825))
-
-
----
-
 ## [2.5.1] - 2025-03-18

 ### Changed
@@ -52,8 +39,8 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).

 ### Fixed

-- Fixed CSVLogger logging hyperparameter at every write which increase latency ([#20594](https://github.com/Lightning-AI/pytorch-lightning/pull/20594))
-- Fixed OverflowError when resuming from checkpoint with an iterable dataset ([#20565](https://github.com/Lightning-AI/pytorch-lightning/issues/20565))
+- Fixed `CSVLogger` logging hyperparameter at every write which increase latency ([#20594](https://github.com/Lightning-AI/pytorch-lightning/pull/20594))
+- Fixed `OverflowError` when resuming from checkpoint with an iterable dataset ([#20565](https://github.com/Lightning-AI/pytorch-lightning/issues/20565))
 - Fixed swapped _R_co and _P to prevent type error ([#20508](https://github.com/Lightning-AI/pytorch-lightning/issues/20508))
 - Always call `WandbLogger.experiment` first in `_call_setup_hook` to ensure `tensorboard` logs can sync to `wandb` ([#20610](https://github.com/Lightning-AI/pytorch-lightning/pull/20610))
 - Fixed TBPTT example ([#20528](https://github.com/Lightning-AI/pytorch-lightning/pull/20528))
```
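Among the 2.5.2 entries above, the new `toggled_optimizer(optimizer)` context manager (#20771) is the main API addition: it wraps the existing `toggle_optimizer`/`untoggle_optimizer` pair so the `requires_grad` bookkeeping is undone automatically when the block exits. A rough sketch of how it might be used under manual optimization; the two-optimizer module and its losses are illustrative only, not taken from the PR:

```python
import torch
from torch import nn
import lightning.pytorch as pl


class TwoOptimizerModule(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.automatic_optimization = False  # manual optimization
        self.gen = nn.Linear(16, 16)
        self.disc = nn.Linear(16, 1)

    def training_step(self, batch, batch_idx):
        # batch is assumed to be a float tensor of shape (N, 16)
        opt_g, opt_d = self.optimizers()

        # Inside the block, parameters owned by the other optimizer have
        # requires_grad=False; it is restored on exit (context-manager form
        # of toggle_optimizer/untoggle_optimizer).
        with self.toggled_optimizer(opt_g):
            g_loss = self.disc(self.gen(batch)).mean()
            self.manual_backward(g_loss)
            opt_g.step()
            opt_g.zero_grad()

        with self.toggled_optimizer(opt_d):
            d_loss = -self.disc(batch).mean()
            self.manual_backward(d_loss)
            opt_d.step()
            opt_d.zero_grad()

    def configure_optimizers(self):
        return (
            torch.optim.Adam(self.gen.parameters(), lr=1e-3),
            torch.optim.Adam(self.disc.parameters(), lr=1e-3),
        )
```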
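The `enable_autolog_hparams` flag (#20593), also listed above, controls whether the Trainer automatically calls `log_hyperparams` on its loggers. A small sketch of turning it off, assuming the flag defaults to enabled and that disabling it leaves hyperparameter logging entirely to user code:

```python
import lightning.pytorch as pl
from lightning.pytorch.loggers import CSVLogger

# With enable_autolog_hparams=False the Trainer skips its automatic
# log_hyperparams() call; hyperparameters can still be logged manually,
# e.g. via self.logger.log_hyperparams(...) inside the LightningModule.
trainer = pl.Trainer(
    max_epochs=1,
    logger=CSVLogger("logs"),
    enable_autolog_hparams=False,
)

# trainer.fit(model, datamodule=dm)  # model and dm defined elsewhere
```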