pytorch
38778add - flash_attention_helper mitigation: pass contiguous inputs (#85135)

flash_attention_helper mitigation: pass contiguous inputs (#85135)

There appears to be a transient issue with non-contiguous inputs in flash_attn, so we pass contiguous inputs to mitigate it.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/85135
Approved by: https://github.com/drisspg
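The mitigation amounts to calling .contiguous() on the attention inputs before they reach the flash-attention kernel, so the kernel always sees densely laid-out tensors. The sketch below is an illustration of that idea, not the actual patch: the function name is hypothetical, and it uses the public torch.nn.functional.scaled_dot_product_attention entry point rather than the internal flash_attention_helper touched by this commit.

```python
import torch
import torch.nn.functional as F

def attention_with_contiguous_inputs(query: torch.Tensor,
                                      key: torch.Tensor,
                                      value: torch.Tensor) -> torch.Tensor:
    # Non-contiguous tensors (e.g. views produced by transpose/permute) were
    # hitting a transient issue in flash_attn; .contiguous() materializes a
    # dense copy so the kernel always receives a contiguous layout.
    query = query.contiguous()
    key = key.contiguous()
    value = value.contiguous()
    return F.scaled_dot_product_attention(query, key, value)

# Usage: transpose produces non-contiguous (batch, heads, seq_len, head_dim) views.
q = torch.randn(2, 128, 8, 64).transpose(1, 2)
k = torch.randn(2, 128, 8, 64).transpose(1, 2)
v = torch.randn(2, 128, 8, 64).transpose(1, 2)
out = attention_with_contiguous_inputs(q, k, v)
```

The extra copies cost some memory bandwidth, which is why the commit message frames this as a mitigation for a transient issue rather than a permanent fix.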