transformers
07bfd2f8 - [XPU] Add flash_attn2 support for XPU (#41956)

[XPU] Add flash_attn2 support for XPU (#41956)

* Add flash_attention_2 and kernels-community/flash-attn support for XPU
* Add flash-attn-2 support for XPU
* Delete deterministic algorithm for xpu
* Fix code style
* Modify repo_id to match the latest kernels-community/flash-attn2
* Fix code style
* Update
* Make quality
* Use kernels loading
* Update
* Delete invalid import
* Update comment

---------

Co-authored-by: Anton Vlasjuk <73884904+vasqu@users.noreply.github.com>
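For context, a minimal usage sketch of how this path might be exercised after the change: loading a model with `attn_implementation="flash_attention_2"` and running it on an Intel XPU device. The checkpoint name, dtype, and XPU setup are assumptions for illustration, not part of the commit; on XPU the attention kernels are expected to come from the kernels-community flash-attn repo loaded via the `kernels` library, as described in the commit message.

```python
# Hedged sketch, not part of the commit: enabling FlashAttention-2 on an XPU
# device with transformers. Requires a PyTorch build with XPU support and the
# `kernels` library installed; model_id is a placeholder checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-3.2-1B"  # placeholder, any causal LM checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    # Selects the FlashAttention-2 attention path; on XPU this relies on the
    # kernels-community flash-attn kernels rather than the CUDA flash-attn wheel.
    attn_implementation="flash_attention_2",
).to("xpu")

inputs = tokenizer("Hello from XPU", return_tensors="pt").to("xpu")
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```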