transformers
d93ef7d7
- Fixes default value of `softmax_scale` in `PhiFlashAttention2`. (#28537)
Fixes default value of `softmax_scale` in `PhiFlashAttention2`. (#28537)

* fix(phi): Phi does not use softmax_scale in Flash-Attention.
* chore(docs): Update Phi docs.
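The commit message implies that Phi's flash-attention path should fall back to the default scale of 1/sqrt(head_dim) rather than overriding `softmax_scale`. The sketch below illustrates that equivalence using PyTorch's `scaled_dot_product_attention` as a stand-in for flash-attn (both default their scale to 1/sqrt(head_dim) when `None` is passed); the tensor shapes are illustrative, not the actual `PhiFlashAttention2` code.

```python
# Minimal sketch, not the PhiFlashAttention2 implementation: leaving the
# scale unset reproduces Phi's eager attention, softmax(Q K^T / sqrt(d)) V.
# flash-attn's `softmax_scale=None` and PyTorch SDPA's `scale=None` both
# fall back to 1 / sqrt(head_dim).
import math

import torch
import torch.nn.functional as F

batch, heads, seq_len, head_dim = 1, 4, 8, 32
q = torch.randn(batch, heads, seq_len, head_dim)
k = torch.randn(batch, heads, seq_len, head_dim)
v = torch.randn(batch, heads, seq_len, head_dim)

# Eager path: explicit 1/sqrt(head_dim) scaling before the softmax.
weights = torch.softmax(q @ k.transpose(-2, -1) / math.sqrt(head_dim), dim=-1)
eager_out = weights @ v

# Fused path with the default scale, which is what passing
# `softmax_scale=None` (instead of a hard-coded value) restores.
sdpa_out = F.scaled_dot_product_attention(q, k, v)

torch.testing.assert_close(eager_out, sdpa_out, rtol=1e-4, atol=1e-4)
```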
References: #28537 - Fixes default value of `softmax_scale` in `PhiFlashAttention2`.
Author: gugarosa
Committed: 1 year ago
Parents: a6adc05e