transformers
d93ef7d7 - Fixes default value of `softmax_scale` in `PhiFlashAttention2`. (#28537)

* fix(phi): Phi does not use `softmax_scale` in Flash-Attention.
* chore(docs): Update Phi docs.
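For context on what this default controls: flash-attention's `softmax_scale` argument is the multiplier applied to the query-key dot products before the softmax, and when it is left as `None` the library falls back to `1/sqrt(head_dim)`. The sketch below is a toy, pure-Python illustration of that behavior; the `attention` function and its inputs are hypothetical and not the library's or the model's implementation.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(q, k, v, softmax_scale=None):
    """Toy scaled dot-product attention.

    When softmax_scale is None, fall back to 1/sqrt(head_dim),
    mirroring flash-attention's documented default; an explicit
    value overrides that fallback.
    """
    head_dim = len(q[0])
    scale = softmax_scale if softmax_scale is not None else 1.0 / math.sqrt(head_dim)
    out = []
    for qi in q:
        # Scale each q·k score, normalize, then mix the values.
        scores = [scale * sum(a * b for a, b in zip(qi, kj)) for kj in k]
        weights = softmax(scores)
        out.append([sum(w * vj[d] for w, vj in zip(weights, v))
                    for d in range(len(v[0]))])
    return out

# Passing None and passing 1/sqrt(head_dim) explicitly are equivalent.
q = [[1.0, 0.0]]
k = [[1.0, 0.0], [0.0, 1.0]]
v = [[1.0, 0.0], [0.0, 1.0]]
default_out = attention(q, k, v)
explicit_out = attention(q, k, v, softmax_scale=1.0 / math.sqrt(2))
```

A wrong default here silently changes the attention distribution (e.g. an unintended extra `1/sqrt(head_dim)` factor flattens the softmax), which is why the default value matters even though no API signature changes.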