DeepSpeed
1e44d48d - Fix potential random layout inconsistency issues in sparse attention modules (#534)

Commit · 4 years ago
Fix potential random layout inconsistency issues in sparse attention modules (#534)

* Register the layout as a buffer of the module so that it is saved and restored with checkpoints.
* Add a broadcast of the layout at initialization to ensure different processes use a consistent layout during distributed training.
* Add a docstring for the max_seq_length argument in SparseSelfAttention.

Co-authored-by: Zhun Liu <zhunliu@microsoft.com>
Co-authored-by: Jeff Rasley <jerasley@microsoft.com>
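The two fixes can be sketched as follows. This is a minimal illustration of the technique described in the commit message, not DeepSpeed's actual sparse attention code; the module and parameter names here are hypothetical. Registering the randomly generated layout with `register_buffer` puts it in the module's `state_dict` (so checkpoints round-trip it), and broadcasting from rank 0 guarantees every process sees the same layout under distributed training.

```python
import torch
import torch.distributed as dist
from torch import nn


class RandomSparseLayout(nn.Module):
    """Sketch of the fix: a randomly generated block-sparse layout that is
    (1) registered as a buffer so it is saved/loaded with checkpoints, and
    (2) broadcast from rank 0 so all ranks agree on it.

    Hypothetical names for illustration only."""

    def __init__(self, num_heads: int, num_blocks: int):
        super().__init__()
        # Random 0/1 block mask; without register_buffer this tensor would
        # not appear in state_dict and each process would keep its own copy.
        layout = torch.randint(0, 2, (num_heads, num_blocks, num_blocks))
        self.register_buffer("layout", layout)
        # In a distributed run, overwrite every rank's layout with rank 0's
        # so all processes compute attention over the same sparsity pattern.
        if dist.is_available() and dist.is_initialized():
            dist.broadcast(self.layout, src=0)
```

Because the layout is a buffer rather than a plain attribute, `model.load_state_dict(...)` restores the exact layout used at training time, closing the second inconsistency path the commit addresses.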