accelerate
8ec83c8a - Fix FSDP2 crash with ignored_params on torch < 2.7.0 (#3924)

Fix FSDP2 crash with ignored_params on torch < 2.7.0 (#3924)

The `ignored_params` keyword argument was unconditionally passed to `fully_shard()`, but this parameter is only supported in torch >= 2.7.0. On older versions (e.g. 2.6.0), this caused a TypeError when using FSDP2 with LoRA adapters or any configuration that triggers the fsdp2_prepare_model code path.

Move the `ignored_params` kwarg out of the base fsdp2_kwargs dict and only add it conditionally when torch >= 2.7.0 and ignored_modules is actually configured.

Fixes #3923
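The fix described above amounts to gating the kwarg on the installed torch version. A minimal sketch of that pattern, with a hypothetical `build_fsdp2_kwargs` helper standing in for the actual code in fsdp2_prepare_model (the function name, signature, and version parsing here are illustrative, not accelerate's API):

```python
def build_fsdp2_kwargs(torch_version: str, ignored_params=None) -> dict:
    """Assemble kwargs for fully_shard().

    The `ignored_params` keyword is only accepted on torch >= 2.7.0,
    so it must be left out of the base kwargs dict on older versions
    to avoid a TypeError.
    """
    # Parse "major.minor" from a version string, ignoring any
    # local build suffix such as "+cu121".
    major_minor = tuple(
        int(part) for part in torch_version.split("+")[0].split(".")[:2]
    )
    kwargs = {}  # base kwargs, safe on every supported torch version
    # Only add ignored_params when both the version supports it and
    # ignored modules are actually configured.
    if ignored_params is not None and major_minor >= (2, 7):
        kwargs["ignored_params"] = ignored_params
    return kwargs
```

On torch 2.6.0 the helper returns the base dict unchanged, so `fully_shard()` never sees the unsupported keyword; on 2.7.0 and later the parameters are passed through as before.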