ea4433fd - Update the flash-attention submodule (#2500)

Summary: We need https://github.com/Dao-AILab/flash-attention/pull/1053/files to externally import `flash_attn_interface` for FA3.

Pull Request resolved: https://github.com/pytorch/benchmark/pull/2500
Reviewed By: bertmaher
Differential Revision: D64190441
Pulled By: xuzhao9
fbshipit-source-id: ff20f0a28514b645c828853e7f15808ed1597ae6