
FSDP checkpointing uses deprecated APIs with PyTorch 2.2 #19462

Open
@carmocca

Bug description

See the deprecation warnings added in pytorch/pytorch#113867.

What version are you seeing the problem on?

v2.2

How to reproduce the bug

The warning originates from this save call:

save_state_dict(converted_state, writer)

We already use the newer API for loading:

if _TORCH_GREATER_EQUAL_2_2:
    from torch.distributed.checkpoint import load
else:
    from torch.distributed.checkpoint import load_state_dict as load  # deprecated
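
A minimal sketch of how the save path could mirror this gating, assuming the existing _TORCH_GREATER_EQUAL_2_2 flag and the converted_state/writer objects from the call above (the storage_writer keyword name is an assumption worth checking against the installed torch version):

if _TORCH_GREATER_EQUAL_2_2:
    from torch.distributed.checkpoint import save
else:
    from torch.distributed.checkpoint import save_state_dict as save  # deprecated

# Both the old and new entry points are expected to accept the writer via
# the `storage_writer` keyword, so the call site stays version-agnostic.
save(converted_state, storage_writer=writer)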

Error messages and logs

/home/carlos/nightly-env/lib/python3.10/site-packages/torch/distributed/checkpoint/state_dict_saver.py:31: UserWarning: 'save_state_dict' is deprecated and will be removed in future versions.Please use 'save' instead.
  warnings.warn(
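
While fixing this, the deprecation can be promoted to an error in tests so any remaining call into the old path fails loudly instead of only printing (plain warnings-module usage, not a Lightning API; save/converted_state/writer as in the sketch above):

import warnings

with warnings.catch_warnings():
    # Turn the deprecation warning into an exception so CI catches any
    # remaining call into the deprecated `save_state_dict` path.
    warnings.filterwarnings("error", message="'save_state_dict' is deprecated")
    save(converted_state, storage_writer=writer)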

Environment

No response

More info

No response

cc @awaelchli @carmocca

Labels

bug (Something isn't working), checkpointing (Related to checkpointing), strategy: fsdp (Fully Sharded Data Parallel)
